AV1 video codec gains broader hardware support
323 points by llcooliovice 1 year ago | 202 comments
- AshleysBrain 1 year agoMicrosoft Edge does support AV1, but weirdly only through a Microsoft Store extension [1], even though Chrome has support built-in. This really sucks in practice, because hardly any normal consumers would bother to install a strangely named extension, so web developers have to assume it's largely unsupported in Edge. Safari ties support to a hardware decoder, which I suppose is understandable to avoid accidental battery drain from using a software codec, and means that in some years' time support can generally be relied upon once enough new hardware is in use. But that won't happen with Edge as things stand!
I think it's high time the web had a single audio and video codec choice that is both open and widely supported, which is why I've proposed support for AV1 and Opus for the Interop 2024 effort [2] [3].
[1] https://apps.microsoft.com/detail/av1-video-extension/9MVZQV...
[2] https://github.com/web-platform-tests/interop/issues/485
[3] https://github.com/web-platform-tests/interop/issues/484
- CrendKing 1 year agoMicrosoft Edge USED to support AV1 through the extension, but disabled the support altogether as of v116. The "Can I use" website [1] has the up-to-date information on this.
- AshleysBrain 1 year agoHuh, I didn't know that, good point! Even weirder though. Hopefully it's a prelude to built-in support.
- jug 1 year agoTreat this as a rumor, but I heard a Microsoft developer hint on Mastodon that they had been having trouble with a patent troll, so maybe that's also the reason behind the somewhat obnoxious workaround. They did say they were dealing with it.
- ZeroCool2u 1 year agoIs there no explanation as to why they removed support and added it as a MS Store bundle? Seems really strange.
- asabla 1 year ago> ...which is why I've proposed support for AV1 and Opus for the Interop 2024 effort...
A very nice initiative!
Feels like we've been lagging behind on development in this area for some time. And with wider support (in both hardware and software), more actors should be able to enter this field in the future.
I really hope this gets enough traction in the near future!
- dcgudeman 1 year agoAlthough this thread is about AVIF (the image format based on AV1), it claims that Edge doesn't support it due to licensing issues:
https://stackoverflow.com/questions/75459594/why-doesnt-edge...
I bet the same licensing issues are also holding back AV1 on Edge.
- thomastjeffery 1 year agoYet no information on what the licensing issue is... This is ridiculous.
- baggy_trough 1 year agoPatent trolls probably.
- jiggawatts 1 year agoDon’t forget an image codec with HDR (10-bit) support!
Jesus wept, this is just a still frame from any of: h.265, AV1, or VP9. Those, or a JPEG XL file.
None of these work in general, especially outside of the Apple moat.
Oh, sure, you can buy a $4000 flagship Nikon camera that can output HDR files in HEIF format, but they won’t open on Windows and look garbled on Apple devices.
This is so stupid now that the iOS version of Adobe Lightroom can edit HDR photos and export them in three formats… none of which can be viewed as HDR on any iOS device! I’ve never ever seen this kind of retardation before — software that is this fundamentally incompatible with the only OS it runs on!
I go on this rant approximately annually. It’s been about a decade. I expect to be stuck using SDR JPG for another decade at this rate.
- jorvi 1 year ago> Safari ties support to a hardware decoder, which I suppose is understandable to avoid accidental battery drain from using a software codec
I still have a pretty deep dislike for Google for turning on AV1 for everyone in Chromium. It’s the ultimate “fuck you, I care more about my bottom line than your user experience”.
Edit: and clown HN rears its head again. I guess cutting users' battery life to a third is worth it as long as Google saves a little bandwidth?
- AshleysBrain 1 year agoDoesn't the Media Capabilities API [1] provide a way to determine if a codec is "power efficient" (presumably meaning hardware supported)? So then you can switch to another codec if AV1 isn't hardware-supported.
[1] https://developer.mozilla.org/en-US/docs/Web/API/Media_Capab...
- jorvi 1 year agoThat might be, but on (most?) Chromium browsers you have to install h264ify to force-disable AV1/VP9 on devices that have no AV1 hardware decode.
What is even more annoying is that you can only do a full disable. You can't disable AV1/VP9 video decode but leave image decode intact.
- fireflash38 1 year ago>Edit: and clown HN rears its head again. I guess cutting users' battery life to a third is worth it as long as Google saves a little bandwidth?
Why do you think so much in browsing has been offloaded to users via JS?
- ddalex 1 year agoStop using Google software and services if they annoy you so much. Vote with your browsing habits.
- jorvi 1 year agoAh yes, let me switch to the YouTube competitor that every channel is multi-loading their content to.
- conjecTech 1 year agoHow much silicon does it take to add an AV1 decoder to a chip? The areas Apple highlighted in their A17 release looked pretty substantial, but I wasn't sure if it was to scale.
https://b3148424.smushcdn.com/3148424/wp-content/uploads/202...
- lxgr 1 year agoI'm pretty sure that a video codec ASIC would share some building blocks across codecs, with per-codec parameterization, so I don't think Apple literally added a single "AV1 box" to the A17/M3 die.
- colonwqbang 1 year agoIt's important to understand that AV1 hardware designs have only recently become available.
In practice a combined AV1/H264/etc. decoder core is most likely. A lot of logic would be shared.
- conjecTech 1 year agoThat makes sense, I'd still like to know what kind of footprint the combined decoder needs.
- brucethemoose2 1 year ago> How much silicon does it take to add an AV1 decoder to a chip?
The die area is very modest, but the hard part is building it in the first place.
Encoding is more area, but should still be peanuts for Apple SoCs.
- sargun 1 year agoDumb question, is this not a readily available IP block that you can buy? Better yet, why haven't industry partners licensed royalty-free IP for this?
- mastax 1 year agoI'm sure the usual suspects (Synopsys, Broadcom, ARM, Xilinx, etc.) would be happy to license something. But from what I can see all the big players make their own. I guess they're easy enough to implement yourself (as a big player) and important enough to not want to leave it in the hands of a third party.
There are also likely opportunities for additional efficiencies when you make a custom {en,de}coder for your system. I suspect (but haven't confirmed) that the typical Intel/AMD/Nvidia/Apple multi-function media engine isn't just a collection of completely independent encoder/decoder blocks for each codec but a kind of simplified specialized microcoded CPU with a collection of fixed-function blocks which can be shared between different codecs. So it could have blocks that do RGB->YUV conversion, Discrete Cosine Transforms, etc. and you can use the same DCT block for AV1, HEVC, and AVC. Maybe you can also create specialized efficient ways to transfer frames back and forth with the GPU, for sharing cache with the GPU, etc.
- IshKebab 1 year agoThat looks like a pure marketing slide to me. I don't think it would make sense to actually have separate dedicated silicon for these.
- monocasa 1 year agoIt is a pure marketing slide. The M3 variants floorplans don't look anything like that, as can be seen on other pictures of the dies.
That being said, it's pretty common to have dedicated silicon for video codecs. It normally takes the form of a little DSP with custom instructions to accelerate operations specific to the codec.
- conjecTech 1 year agoI agree that is how I'd expect it to be implemented, but I'm not sure how small it would be given the processing bandwidth we are talking about for 4k video.
I'm guessing this is a distinct region of the chip and not integrated with CPU/GPU since they scale up by replicating those blocks and wouldn't want to redundantly place that hardware. Having it separate also allows a team to work on it independently.
I think the relative size of the media engines is accurate in that slide, so then it comes down to how large the ProRes parts are in other chips. They are probably a couple of the unlabeled regions next to the performance cores in the M1 Pro die shot below, but I don't know which.
https://images.anandtech.com/doci/17019/M1PRO.jpg Taken from: https://www.anandtech.com/show/17024/apple-m1-max-performanc...
- marcellus23 1 year ago> The M3 variants floorplans don't look anything like that
Maybe I'm misunderstanding what you're saying, but the slide is of an A17, not an M3 chip.
- IshKebab 1 year agoYes exactly. But are you going to have a different DSP for each codec? Doubtful.
- asylteltine 1 year agoHow would you do it in hardware otherwise?
- IshKebab 1 year agoWith a DSP that has generic instructions that are especially useful for video codecs.
- wyager 1 year agoYou can either have a fully dedicated core for codecs, or you can just put certain codec related operations (like DCT-related SIMD) in your main cores. Cryptographic acceleration tends to use the latter approach.
- arbitrarian 1 year agoAccording to the article, it's supported by every browser except Edge. It will be interesting to see who ultimately ends up making a better IE, Safari or Microsoft. So far, it seems Safari is winning, given the ever growing set of standards they don't support, but maybe this is Edge trying to catch up?
- ZeroGravitas 1 year agoThe data at CanIUse.com seems to suggest they're going backward and used to have an installable plugin:
see also:
https://www.thurrott.com/forums/microsoft/windows/thread/you...
Which seems to be claiming the software fallback was suddenly yanked, temporarily breaking YouTube, which fixed it by serving VP9 instead. But maybe AV1 hardware decode is still working?
- pjc50 1 year agoOdd given that Edge is Chromium.
- baggy_trough 1 year agoWhat are the "ever growing set of standards" that Safari doesn't support?
- arbitrarian 1 year agoThis was discussed here not too long ago. I assume not much has changed since then. https://news.ycombinator.com/item?id=31902707
- threeseed 1 year agoMost of those that were discussed have been implemented. The new list is here [1].
What you see is that they implement different features with Google obviously wanting the browser to be a full blown OS. So they added things like MIDI support and web pages being able to access the battery.
The problem with many of those features is that they have been shown by researchers to either (a) be insecure or (b) allow web pages to uniquely fingerprint your device. This is obviously anathema to everything Apple believes in.
[1] https://caniuse.com/?compare=chrome+121,edge+118,safari+17.2...
- baggy_trough 1 year agoSo you don't know, but you assume it's the same, so "ever-growing" is just an uninformed slam? If anything the reality is the opposite.
- threeseed 1 year ago
- asylteltine 1 year agoNothing that is relevant. I use Safari across all my devices and I have never had an issue besides very specific things like flashing microcontrollers through a web interface, which I had to use Chrome for. That's basically irrelevant for the entire world. Safari is mega efficient and fast and actually cares about your privacy.
- alwillis 1 year agoThere was so much hoopla on HN about Chrome abandoning JPEG XL recently while Safari added support for it.
- xvilka 1 year ago[flagged]
- mort96 1 year agoSo a few things here:
1. A fair number of users are gonna use the default browser. You probably want to support those users.
2. iOS users can't download non-WebKit browsers. You probably want to support iOS users.
- cptcobalt 1 year agoWhat is a "real" web browser? One would assume it's an application that takes web content and parses it for user display so they can use it to browse the web...so Edge and Safari would surely be real, right?
- sp332 1 year agoEdge and Safari (and Firefox) have reader mode, which automatically makes them better than Chrome.
- sixothree 1 year agoBecause they want something less user hostile than Chrome?
- dcgudeman 1 year agoIf anyone is wondering about the Microsoft Edge situation, feel free to investigate the "licensing issue" referenced in this StackOverflow thread:
https://stackoverflow.com/questions/75459594/why-doesnt-edge...
Although the thread is about AVIF (the image format based on AV1), I bet the same licensing issues are also holding back AV1 on Edge.
- lelag 1 year agoI would add that broader AV1 support is also good news for low-latency applications like cloud gaming or streaming to VR headsets. HW-accelerated encoding/decoding improves upon HEVC on every metric, so you can get near real-time streaming with higher quality and lower bandwidth requirements.
- oittaa 1 year agoDoes anyone know if Youtube supports AV1 in their pipeline for ultra low latency streaming?
https://support.google.com/youtube/answer/7444635?hl=en#zipp...
- marcosscriven 1 year agoThis doesn't match up to my experience playing on Quest 3. Certainly I see better quality for lower bandwidth, but the client-side decoding actually adds more latency by the point it actually looks better than HEVC.
- lelag 1 year agoInteresting... Then choosing the best codec might depend on the specific use case, and the amount of motion in particular.
I have a 3080 so I haven't had the chance to test AV1 yet. Is the latency difference significant as reported by VD?
- rollcat 1 year ago> improve upon HEVC on every metrics
Color me the skeptic here, but which benchmark(s) are we talking about? Even h264 vs h265 is not a settled matter - if we truly consider every possible metric, including e.g. SW encoding.
- ZeroGravitas 1 year agoFor software encoding Intel suggest you can beat all the other classic SW encoders with AV1 as long as you have enough cores:
https://d1qg7561fu8ubi.cloudfront.net/blog/BD-rate_fiugure5....
Lower and to the left is "better"
Edit: resolution on that graph image is terrible but they've been sharing it for a while in slide decks so you can probably find better quality by following links from here:
https://networkbuilders.intel.com/blog/svt-av1-enables-highl...
- galad87 1 year agoSVT-AV1 still has a lot of visual quality issues. PSNR, SSIM, VMAF are useful metrics, but optimising for these won't get you the best encoder. x264 didn't get its reputation for going after PSNR and SSIM.
- tentacleuno 1 year agoI'm interested as to why it isn't a settled matter. In my experience, H265 files tend to strike really nice compression ratios (in the same vein, after an AV1 transcode, I'm typically left gobsmacked).
(Or were you talking more about latency? In that case I have to defer to someone with more knowledge.)
- rollcat 1 year agoh265 has about 20% lower bitrate than h264 at a very similar perceptible quality, but encoding several variants (adaptive streaming) quickly becomes more taxing on the hardware, and support for decoding h264 in hardware is both more ubiquitous and less expensive. As a concrete example, the 2018 generation of Amazon Fire TV sticks supports h265 but gets really hot, so when an adaptive stream offers both h264 and h265, the Fire TV picks the former. We were experimenting with detecting Fire TV serverside to give it a h265-only HLS manifest (the cost savings on the CDN would be sweet), but ultimately decided against it - the device manufacturer probably had a legitimate reason, be it stability or longevity.
I don't quite understand the industry push for AV1. I appreciate that it's patent-unencumbered, but it makes very little sense from business perspective, as you still need to support h264 and/or h265 for devices that can't decode av1 in hardware (and let's agree that forcing software decoding for video should be criminal). So you add a third codec variant (across several quality tiers) to your stack, cost per minute (encode, storage) goes up, engineering/QA effort goes up... Where's the value? Hence my original question, is AV1 really that much better to justify all that?
- BoumTAC 1 year agoPerhaps this might seem like a basic question, but why has it taken so long for processors to support AV1, given that it has been out for years?
- repelsteeltje 1 year agoKeep in mind that standards move slowly, and codec standards more so. The gold standard is still h264/AVC, which dates back to the nineties. This is primarily due to many appliances (set top boxes, cameras, phones, TVs) using the absolute cheapest hardware stack they can get their hands on.
- pipo234 1 year ago+1 Indeed.
Compared to other standards in streaming media, I'd say that AOMedia has found adoption a lot quicker. h265 (HEVC) was all but DoA until, years after its introduction, Apple finally decided to embrace it. It is still by no means ubiquitous, mostly due to patent licensing, which significantly drives up the price of hardware in the single-digit-dollar price range.
Anecdotally, consider that Apple's HTTP Live Streaming protocol (till version 6) relied on MPEG2TS, even though Apple laid the groundwork for ISO/IEC 14496-12 Base Media File Format aka MP4. The reason was that chips in the initial iPhones only had support for h264 using mpeg2 transport streams, and even mp4 -> mp2 transmuxing was considered too resource intensive.
- bscphil 1 year ago> h265 (HEVC) was all but DoA until years after it's introduction Apple finally decided to embrace it
No? You're talking in terms of PC / phone hardware support only. HEVC was first released June 7, 2013. The UHD Blu-ray standard was released less than 3 years later on February 14, 2016 - and it was obvious to everyone in the intervening years that UHD Blu-ray would use HEVC because it needed support for 4k and HDR, both of which HEVC was specifically designed to be compatible with. (Wikipedia says licensing for UHD Blu-ray on the basis of a released spec began in mid 2015.)
- dylan604 1 year ago>Anecdotally, consider that Apple's HTTP Live Streaming protocol (till version 6) relied on MPEG2TS
This sounds like you might be conflating MPEG2TS with the video encoding, when it is solely the way the video/audio elementary streams are wrapped together into a single container format. The transport stream was designed specifically for an unreliable streaming type of delivery vs a steady consistent type of source like reading from a dis[c|k]. There is nothing wrong with using a TS stream for HLS that makes it inferior.
- throw0101c 1 year ago> Compared to other standards in streaming media, I'd say that AOMedia has found adoption a lot quicker. h265 (HEVC) was all but DoA until years after it's introduction Apple finally decided to embrace it.
Well, there's now H.266 as well:
- 7speter 1 year agoWas it all but dead because people thought h264 was good enough, until 2.5K and 4K became more prominent in media consumption? It seems really useful if you're encoding at resolutions higher than 1080p, and it makes me less regretful that I have a bunch of recent hardware that didn't get AV1 hardware support :)
- londons_explore 1 year ago> even mp4 -> mp2 transmuxing was considered too resource intensive.
Really? Muxing generally doesn't require one to decode the actual data - merely shuffle around blocks of data with memcpy() so should be really cheap.
- astrange 1 year agoAVC is from 2004-ish, but CPUs weren't good enough to play it in realtime for a few more years.
- colonwqbang 1 year agoThe H265 patent licensing situation is famously a mess and has been a big barrier to adoption. Except in circles where people worry less about that sort of thing: Warez, China, ...
The licensing shenanigans of H265 was a big motivator for creating AV1, a royalty free codec.
- londons_explore 1 year agoHardware design cycles are long.
The person who benefits from a more efficient codec tends to be netflix/youtube (lower bandwidth costs), and they are far far removed from the chipmaker - market forces get very weak at that distance.
- nickpeterson 1 year agoAlso, specialized hardware for a specific format feels like a really good way to have a useless part of a cpu if the format stops being used.
- imtringued 1 year agoPeople never stopped using VP8. In fact, your screen sharing is probably wasting excessive amounts of CPU every day because there is no hardware support.
- kccqzy 1 year agoYouTube makes their own chips: https://www.protocol.com/amp/youtube-custom-chips-argos-asic...
Netflix doesn't benefit since their catalog is orders of magnitude smaller.
- ZeroGravitas 1 year agoAV1 isn't particularly behind schedule compared with previous codec generations. We could and should have moved faster if everything had gone well, but Qualcomm in particular were being awkward about IP issues.
Luckily the effort behind the dav1d software decoder kept up the rollout momentum.
- cubefox 1 year agoThe AV1 ASIC takes up space on the SoC, so effectively it decreases the performance of other parts. This could be why some manufacturers have delayed including support for quite a while. Though Mediatek already had AV1 support three years ago.
- ksec 1 year agoEspecially when Google and AOM promised a hardware encoder and decoder to be given out for free by 2018, implemented in many SoCs by 2019, with wide availability and AV2 by 2020.
Well, the basic answer is that making an efficient hardware encoder and decoder, within power budget and die space, all while conforming to the standard (because you won't have much of a chance to correct it), and fitting it into the SoC design cycle, which is and always has been at least three years minimum, is a lot harder than most software engineers at Google and AOM would have thought.
- fomine3 1 year agoTwo big SoC manufacturers, Apple and Qualcomm, hold patents on competing codecs like HEVC
- llm_nerd 1 year agoApple has a tiny sliver of the patents in HEVC, and while we don't have the numbers I feel pretty certain they pay far more into the pool to ship HEVC in their devices than they get out of it. The same is doubly true of Qualcomm who aren't even a part of the pool.
HEVC was finalized in 2013. AV1 was finalized in 2018, and has just finally started getting a robust ecosystem of software and hardware.
- orra 1 year agoThat's important context! Adoption of HEVC was so slow that I honestly thought it was released around the same time as AV1.
- fomine3 1 year agoIn the context of a format war, winning means that they earn patent fees from every device. I don't think it's Apple's intention, but possibly Qualcomm's.
- dist-epoch 1 year agoChicken & egg problem.
No AV1 videos -> no pressure to add hardware support -> difficult to justify encoding videos in AV1
- FullyFunctional 1 year agoNetflix (and Youtube? I forget) will push an AV1 stream if you have the support. This was even mentioned in Apple's show yesterday. So the egg is already there and the chicken is slowly coming, thankfully.
- entropicdrifter 1 year agoYouTube was the first to support it. They even went to war with Roku over it: Roku killed the YouTube TV app in retaliation for YouTube's mandate that all next-gen devices support AV1, so YouTube went ahead and embedded it inside the regular YouTube app.
Roku's latest devices do support AV1, so I guess either the price came down, they struck a deal, or Roku just lost to the market pressure after Netflix pushed for AV1 as well.
- 7speter 1 year agoI think a lot of content creators really want AV1 because of the drastic reduction of file sizes. Streaming companies want it to catch on because of the drastic reduction in bandwidth use.
- bee_rider 1 year agoI thought Google was the main one behind AV1. Couldn’t they use their position as one of the world’s biggest video platforms to break that chicken egg loop?
- alex_duf 1 year agoThe article mentions Android 14 requiring AV1 hardware support, so yes.
- troupo 1 year agoThey have. They literally threatened to pull their support from devices if they don't implement the codec in hardware. Roku's spat with Google was a big-ish story when that happened.
I don't know how that can be viewed as a good thing.
- lxgr 1 year agoYouTube has been supporting AV1 for a while now.
- entropicdrifter 1 year agoThey have been. YouTube was the first big platform to push for AV1
- brucethemoose2 1 year agoIt's Apple and Qualcomm that have been slow.
Intel, AMD, Nvidia, and other ARM chipmakers for phones, TVs, streaming sticks and such were quicker to pick it up.
- AzzyHN 1 year agoSame reason why you'd write code with Java 8 or C89
- baybal2 1 year ago[dead]
- galad87 1 year agoExisting codecs are good enough in most cases.
- RicoElectrico 1 year agoIs it really _that_ hard to create a generic video decoding DSP whose firmware could be updated? Most codecs are very similar to each other. IIRC Texas Instruments used multicore DSP to decode MPEG back in the 90s.
Or maybe we should have written codecs to be amenable towards GPU shader implementation...
- wmf 1 year agoThe problem with a generic codec DSP is how fast do you make it? Newer codecs often require twice as much computation as older ones, so do you make the DSP twice as fast as you need today and hope that's enough to run future codecs? Meanwhile you're wasting transistors on a DSP that won't be fully used until years later.
To some extent the PS3 did this; the Cell SPEs were fast enough to implement Blu-ray and streaming video playback in software and they made several updates over the life of the PS3.
- kllrnohj 1 year ago> Or maybe we should have written codecs to be amenable towards GPU shader implementation...
They are, but programmable GPU shaders are nearly always more power expensive than fixed-function, purpose-specific silicon. It's why many key aspects of GPUs are still fixed function, in fact, including triangle rasterization and texture sampling/filtering.
- brucethemoose2 1 year agoAOM already wrote a GPU shader-assisted decoder for the Xbox One:
- bee_rider 1 year agoI wonder if encode could run on the iGPU?
I think, at least, that one of the biggest use-cases for encode is game streamers (is this right?); they should have decent dGPUs anyway, so their iGPU is just sitting there.
- brucethemoose2 1 year agoElemental wrote a GPU shader h264 encoder for the Radeon 5870 back in the day, marketed towards broadcasters who needed quality and throughput: https://www.anandtech.com/show/2586
Intel used to write hybrid encoders (that used some fixed function and some iGPU shader) for their older iGPUs.
So the answer is yes... if you can fund the right group. But video encoders are hard. The kind of crack developer teams that can pull this off don't grow on trees.
- astrange 1 year agoShaders have little benefit for anything with "compression" in the name. (De)compression is maximally serial/unpredictable because if any of it is predictable, it's not compressed enough.
People used to want to write them because they thought GPU=fast and shaders=GPU, but this is just evidence that almost no one knows how to write a video codec.
- dannyw 1 year agoIs AV1 better than HEVC? Or is it about the same, just freely licensed?
- Strom 1 year agoWe have to make a distinction between the standard and implementations. AV1 has open source implementations which achieve really high compression. HEVC also has implementations which achieve really high compression, and do so faster than AV1, but all the good ones are paid, like MainConcept. [1] The open source HEVC implementations (i.e. x265) are unfortunately quite weak compared to their AV1 counterparts and do not achieve comparable compression.
So yes, the answer depends most on whether you care about licensing, both in terms of royalties and also implementations.
--
[1] https://www.mainconcept.com/hevc - for the casual user most easily accessed by purchasing Adobe Media Encoder
- dannyw 1 year agoVery insightful, but is x265 really that bad? If you're willing to wait, do slower presets help?
I tested NVENC, x265, and DaVinci Resolve Studio's H265 encoder.
x265 was best by far. What more am I missing?
- Strom 1 year agoYes, x265 isn't even in the conversation for the top 5 HEVC encoders, regardless of presets. [1] It can be especially surprising because x264 is easily the best AVC encoder. For whatever reason (patents, lack of browser support etc) there just hasn't been as much engineering effort put into x265.
Now things like NVENC are even worse in terms of compression. Any GPU accelerated encoder trades compression efficiency for speed. Even x265 with the slowest presets will demolish any GPU encoder in terms of compression, including the MainConcept paid one when it's run in GPU-accelerated mode. This is unfortunately not explained in GUIs like Adobe tools. They just have a checkbox or dropdown to select GPU acceleration, but don't mention that it's not just acceleration - a different algorithm will be used that can't achieve the best compression.
GPU accelerated compression can still be very useful for scenarios where you need speed (e.g. have a deadline) or just don't care about quality (e.g. will publish it only on social media where it will be recompressed anyway). However when you have time to wait and want top quality, the slowest CPU-only code path will always win.
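If you want to sanity-check this on your own footage, here's a rough sketch of the kind of comparison I mean (assuming ffmpeg is on your PATH and was built with libx265 and NVENC support; filenames and quality values are just placeholders):

    # CPU-only x265 at a slow preset: slowest, but the best compression x265 can do.
    ffmpeg -i clip.mkv -c:v libx265 -preset slower -crf 22 -c:a copy clip-x265-slower.mkv

    # NVENC HEVC at its highest-quality preset: much faster, but expect a larger file at similar visual quality.
    ffmpeg -i clip.mkv -c:v hevc_nvenc -preset p7 -rc vbr -cq 22 -c:a copy clip-nvenc.mkv

Note that -crf and -cq values aren't directly comparable across encoders, so to be fair you'd match bitrates or use a metric rather than reuse the same number.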
--
[1] One good public resource is the Moscow State University page http://www.compression.ru/video/codec_comparison/index_en.ht... -- They do regular comparisons of various codecs and some results are available for free. A bit in HTML form and more in some of the PDFs. Deeper insights are unfortunately paid.
- windsurfer 1 year agoHEVC isn't widely supported by web browsers: https://caniuse.com/hevc
AV1 is supported by almost all web browsers: https://caniuse.com/av1
- frakkingcylons 1 year agoNotable exceptions are Safari and Safari on iOS (unless you have an iPhone 15 Pro).
- alwillis 1 year agoOr an M3 Mac.
- threeseed 1 year agoMost of those browsers are just skins over the Chromium engine.
And iOS Safari is arguably the most important browser of all, since iPhones are disproportionately used amongst video content creators.
- Vrondi 1 year agoHEVC is great for giving you quality in a smaller file size, for your local video files (due to being supported by more cameras and phones so far). AV1 has wider support for streaming video online at a higher quality (due to more browser support so far). So, at the moment, they are really best in two different use cases (subject to change, of course).
- Isthatablackgsd 1 year ago> HEVC is great for giving you quality in a smaller file size, for your local video files.
This is a huge plus in the fansub anime scene. 10 years ago, the majority of anime releases were H264 (720p/1080p 8-bit), normally about 1 GB for each 25-minute episode. If I wanted to watch one anime, it would consume about 20 GB of space. Now, the majority are HEVC (1080p 10-bit), at about 300 MB per episode.
- Saoshyant 1 year agoIf you actually look up AV1 releases like Trix, the file sizes are even smaller, while keeping up the same quality.
- FroshKiller 1 year agoI don't think you can reduce the comparison to "better than," but AV1 certainly compresses to smaller files than HEVC. Maybe that's a better fit for your use case.
- caskstrength 1 year agoGlad to see Amazon is participating. I would love to have much higher Twitch stream quality with similar bitrate.
- dishsoap 1 year agoI would love to see them triple the max bitrate cap from 1 MB/s to 3 or so
- llcooliovice 1 year agoNow supported by the Apple M3 chips.
- FirmwareBurner 1 year agoOnly decode, not encode. Intel ARC GPUs also do encode.
- whynotminot 1 year agoI think it makes sense for companies to start with decode though. That hits pretty much 100% of users--everyone watches video.
But only a small fraction of users actually create content and need accelerated encode. And Apple especially I think is unlikely to use AV1 for their video recording, given their investment in other formats for that use-case.
- repelsteeltje 1 year ago> And Apple especially I think is unlikely to use AV1 for their video recording, given their investment in other formats for that use-case.
I concur. The raison d'être for AV1 is (lack of) patent license royalties. These apply to user devices as well as services. Think Google: Android as well as YouTube cost fortunes in AVC/HEVC licenses, so here AV1 makes sense.
On the other hand, Apple sells expensive hardware and has no problem ponying up for those licenses. Soon after adopting HEVC they doubled down with Dolby Vision, which technically adds very little on top of the standard HDR features already available in HEVC and AVC but presents serious interop problems for devices that come with shiny Dolby stickers.
- kevincox 1 year agoPlus unless you are streaming or producing a ton of video most users can afford to wait a bit for software encoding (which is often better quality as well). So encoding is far less important than decoding.
- adrian_b 1 year agoAlso the AMD Phoenix CPUs for laptops (7640/7840/7940), the AMD RDNA 3 GPUs and the NVIDIA RTX 4000 GPUs support AV1 encoding.
- my123 1 year agoThe Snapdragon 8 Gen 3 and X Elite have AV1 encoding too
- lxgr 1 year agoAs far as I can tell, Apple has always only supported decoding for non-MPEG codecs.
And their encoders (at least on macOS in the past) usually don’t yield results comparable to software or dedicated quality-optimized encoding ASICs, so if I wanted high quality at low bitrates I’d have to reencode offline anyway.
It would be nice to have it available for video conferencing or game streaming, though.
- 3cats-in-a-coat 1 year agoWhat was the situation on its efficiency compared to H.265 and similar?
- repelsteeltje 1 year agoIt's in the same ballpark. Both do considerably better than AVC (h264), but many direct comparisons between HEVC (h265) and AV1 compare apples to oranges. Sure you can get 30% lower bitrate, but only at degraded quality levels or higher decode complexity.
Also note that HEVC had a considerable head start (5 years?), so performant encoders (or even energy-efficient decoders) took a while to catch up. Recent ffmpeg versions offer a lot of options; you'll find that even a basic comparison is PhD-level difficult ;-)
- bscphil 1 year ago> Sure you can get 30% lower bitrate, but only at degraded quality levels or higher decode complexity.
Thank you for pointing this out. This thread is a mess of claims at the moment because this simple fact is under-recognized.
There are two accepted ways to compare codecs+settings: either (a) you perform a subjective comparison with the human eye using the same bitrate for both codecs, or (b) perform an "objective" metrics-based comparison where you match measured quality and compare the ratio of the bitrates.
If you're looking only at 1080p SDR 8-bit video, even h264 is already commonly used at bitrates that can approach transparency to the source (visually lossless to the human eye) when encoded well. For example, a typical Blu-ray bitrate of ~30 Mbps can achieve transparency when well-encoded for most sources.
The reason measures like "30%" are misleading is that if you try to match h264 performance at these high bitrates, you won't get anything close to 30% improvement (with HEVC over h264, or AV1 over HEVC). It can be negligible in a lot of cases. In other words, the improvement ratio from increasing the complexity of your media codec depends on the target quality of the encodes used in the test.
AV1 achieves significant improvements ("30%") over HEVC only at the lowest qualities, think YouTube or Twitch streaming. At high bitrates, e.g. something acceptable for watching a movie, the improvement can be much less or even insignificant, and at near-transparency a lot of AV1 encoders actually seem to introduce artifacts that are hard to eliminate. AV1 seems heavily optimized for the typical streaming range of bitrates, and claims about its supposed improvement over HEVC need to be understood in that context.
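If anyone wants to run the metrics-based version of this comparison themselves, here is a minimal sketch (assuming an ffmpeg build with libx265, libsvtav1 and libvmaf; the filenames and the 4M bitrate are arbitrary placeholders, and the inputs must match in resolution and framerate):

    # Encode the same source at the same target bitrate with both codecs...
    ffmpeg -i source.mkv -c:v libx265 -b:v 4M -an hevc-4M.mkv
    ffmpeg -i source.mkv -c:v libsvtav1 -b:v 4M -an av1-4M.mkv

    # ...then score each against the source (first input = distorted, second = reference).
    ffmpeg -i hevc-4M.mkv -i source.mkv -lavfi libvmaf -f null -
    ffmpeg -i av1-4M.mkv -i source.mkv -lavfi libvmaf -f null -

Repeat across a range of bitrates and you get the rate-quality curves the "30%" claims are derived from, which is exactly why those claims depend so heavily on the quality range you test.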
- tagrun 1 year agoDepends on the encoder, this website provides easy-to-visualize data sets for various encoders at various settings https://arewecompressedyet.com/ AV1 encoders tend to have better VMAF score at a given bits-per-pixel.
- furyg3 1 year agoIt's about 30% more efficient.
- 3cats-in-a-coat 1 year agoWow that was unexpected. I checked online and it does say production encoders are faster and the result is somewhat smaller (for same quality). What a time to be alive.
- botanical 1 year agoI don't know what's up with AV1 software decoding but it uses considerably more CPU than HEVC or VP9
- rbultje 1 year agoAV1 is more complex. Prediction can be more complex with things like "combined inter/intra" or "warped motion" or "overlapping block MC" compared to HEVC/VP9. Then there's additional postfilters like loop restoration, cdef and film grain that didn't exist in VP9 (just deblock - which also exists in AV1) or HEVC (deblock + sao). Entropy coding is more expensive than VP9 with per-symbol entropy updates (which HEVC also has). Bigger blocks are probably an overall win, but bigger transforms can be painful with many non-zero coefficients. And intermediates in 2D MC and bi-directional ("compound") prediction are 12-bit instead of 8-bit in VP9 for SDR/8bit video. This is more similar to HEVC. So overall, AV1 > HEVC > VP9 in terms of runtime complexity, this is expected, nothing you can do about it.
- dishsoap 1 year agoIt looks like dav1d is (or was 2 years ago, maybe it's even faster now) on par with ffmpeg's hevc decoder in that aspect: https://youtu.be/wkZ4KfZ7x1k?t=656
You're right about VP9 though, definitely faster to decode, though there are trade-offs (video quality, encoding performance) when compared to HEVC.
- ReactiveJelly 1 year agoI wonder if it's just because it's new. I remember years ago the first 1.x versions of libopus used way more CPU than Vorbis to decode, now they're comparable. (This was on a teeny little chip where decoding 1 Vorbis stream took a measurable amount of CPU)
- rbultje 1 year ago
- sergiotapia 1 year agoI love AV1 for compressing my movies to 720p. I also convert any audio to Opus. I get to archive a ton of content on peanuts. The videos look great on my PC monitor or my phone and they come in at around 452MB (1hr 24m video).
Here's my script if you're interested in trying it out on your content.
    param (
        [Parameter(Mandatory=$true)]
        [string]$sourceDir,
        [string]$destDir = $sourceDir
    )

    $ffmpegPath = 'C:\Users\sergi\Downloads\ffmpeg.exe'

    Write-Output "Starting conversion..."

    Get-ChildItem $sourceDir -Include *.mp4,*.avi,*.mov,*.wmv,*.flv,*.webm,*.mkv -Recurse | ForEach-Object {
        $newFileName = $_.BaseName + '-av1-720p' + $_.Extension
        $destPath = Join-Path $_.Directory.FullName $newFileName
        Write-Output "Converting $($_.FullName) to 720p AV1..."
        & $ffmpegPath -i $_.FullName -vf scale=1280:720 -c:v libsvtav1 -crf 30 -preset 7 -c:a libopus -b:a 96k -ac 2 $destPath
    }

    Write-Output "Conversion complete."

And I just invoke it against a folder to recursively convert stuff:

    .\av1-convert.ps1 -sourceDir 'D:\Movies to convert\'

As soon as there's something that can decode AV1 that's like an nvidia shield I will replace both of my shields. So far nothing like that exists to my knowledge. Even Roku 4k Pro's say "AV1" support in their spec but they still trigger transcoding on plex when doing a playback.
- kidfiji 1 year agoAs someone who also has a 2019 Shield TV Pro and is waiting for the "next best thing", one resource I've been keeping my eye on is androidtv-guide.com:
https://www.androidtv-guide.com/streaming-gaming/av1-android...
- kibwen 1 year agoWhat sort of file sizes are you getting for 720p video in AV1? Are there any other relevant parameters that you tweak, e.g. framerate?
- sergiotapia 1 year agoIt's there in the comment, both my filesize and the command verbatim. I don't do anything else other than what's in the powershell script.
- kibwen 1 year agoThanks, not sure how I glazed over that. :)
- mvanbaak 1 year agowait, you convert to 720p and want to play that using a shield pro type of device. This might be ok in your current setup, but as soon as you upgrade the panel to 1080p or 2160p, you would want the source to be at the same resolution, or better.
- sergiotapia 1 year agoI'm aight to be honest. MPV is stuck to the side of one of my monitors. I don't need more resolution at all.
On my shields I play 4K remuxes from a plexshare.
- chungy 1 year agoI don't recognize that language. Is it PHP?
- tredre3 1 year agoIt's PowerShell. It's Windows' Bash equivalent (admittedly much more advanced), but it's been open-sourced and ported to Linux as well.
- atlgator 1 year agoAV1 hardware support is great and all, but what streaming services actually support it? Twitch did a pilot back in 2020 and the video quality was fantastic. They still haven't rolled it out.
- Chris_Newton 1 year agoPossibly worth noting that the encoding speeds for AV1 have improved out of all recognition over the past few years. Depending on what Twitch were using, even back in 2020 it might have been orders of magnitude slower than encoding other relatively modern formats. Today that is no longer true and in some circumstances encoding AV1 seems to be faster than almost anything else since H264. So if hardware decoding is also improving, it’s certainly possible that more services could potentially use AV1 now or in the near future.
(Source for the above is just some personal experiments. I happened to be doing a load of benchmarking with video codecs this weekend, as we’re hoping to start using AV1 for a website I run.)
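If anyone wants to reproduce the speed comparison, here's a quick-and-dirty sketch in PowerShell (the filenames and settings are placeholders, ffmpeg is assumed to be on PATH with libx264 and libsvtav1, and the CRF values are not quality-matched across codecs; this only measures wall-clock time):

    # Time a single encode of the same clip with each encoder.
    Measure-Command { ffmpeg -y -i clip.mkv -c:v libx264 -preset medium -crf 22 -an x264-test.mkv }
    Measure-Command { ffmpeg -y -i clip.mkv -c:v libsvtav1 -preset 8 -crf 35 -an svtav1-test.mkv }

On recent SVT-AV1 releases the faster presets (8 and up) are where that speed improvement shows up most clearly.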
- 2OEH8eoCRo0 1 year agoIs this the new hardware treadmill? Every few years we need to switch codecs for minor bandwidth/quality gains to nudge us to buy new hardware that supports it?
- andrewstuart 1 year agoSafari does not support AV1 playback.
- alwillis 1 year ago> Safari does not support AV1 playback.
It will on M3 Macs and iPhone 15 Pro and Pro Max.
- FoxBJK 1 year agoI have a 15 Pro. AV1 playback is not working in Safari. I'm trying with this URL: https://bitmovin.com/demos/av1/
- m3kw9 1 year agoWhat’s so special about AV1?
- throw0101c 1 year agoCompared to H.264/AVC, you can get the same level of video quality in ~half the bandwidth (or double your quality for the same bandwidth).
Compared to H.265/HEVC, AV1 is royalty-free, so anyone can implement it without worrying about licensing (for HEVC there seem to be 3+ patent pools that need to be paid off).
The trade-off is that it is more computation intensive than H.264 (as is H.265).
- throw0101c 1 year agoSelf-reply:
Seems that there's now also H.266/VVC:
* https://en.wikipedia.org/wiki/Versatile_Video_Coding
Short 2023 article on H.266 versus AV1:
* https://www.winxdvd.com/video-transcoder/h266-vvc-vs-av1.htm
- ZeroGravitas 1 year ago* No patent fees.
* Wide browser support.
* High performance open source software decode available on x64 and ARM
* High performance open source software encode available, tuned for multicore cloud encoding
* default support for film grain emulation which alone can save about 30% bandwidth when content requires it.
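On that last point, here is a minimal sketch of how the film grain path is typically used (assuming a recent ffmpeg built with libsvtav1; the input name and grain strength are just examples, and the grain level runs 0-50):

    # Model the source grain and let the decoder re-synthesize it, instead of spending bits encoding it.
    ffmpeg -i grainy-source.mkv -c:v libsvtav1 -preset 6 -crf 30 -svtav1-params film-grain=10 -c:a libopus -b:a 128k grainy-av1.mkv

The encoder strips the grain, signals its statistics in the bitstream, and the decoder synthesizes matching grain on playback, which is where the bandwidth saving on noisy content comes from.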
- tizio13 1 year agoQuite a lot actually. This codec is much more efficient at producing high quality video recording / streaming at much lower than normal bit rates when comparing to x264/5. Epos Vox has a good video describing the benefits: https://youtu.be/sUyiqvNvXiQ
- supertrope 1 year agoH.264's most likely successor was HEVC. While Google and Mozilla strongly prefer VP8/VP9, most video content distributors are okay with paying the license fee for H.264. One patent pool, one license fee. HEVC's patent pool fragmented, so even after you pay one fee there might be another one, or even worse, patent trolling litigation. So non-broadcast companies are adopting AV1 to avoid using HEVC when possible.
- toastal 1 year agoCan’t wait to see broader JPEG XL hardware support
- Andrew018 1 year ago[dead]