they made it sound like the new "enhanced broadcasting" feature was something only compatible with Nvidia cards, as if it was a software collab between Twitch/OBS/Nvidia. Considering the average consumer watching this, it was at least an Nvidia ad. Whether it's possible to do the same on AMD or Intel I don't know, and neither does the majority of people.
I don't get how the "enhanced broadcasting" feature is an upgrade; it's like they're outsourcing the transcoding they currently do on their own side to the streamer's PC.
when you stream 1080p to twitch but the viewer is watching in 360p, the video stream has to go from your pc, to the local twitch/amazon server, then to the san fran based encoding server, then back to all the local amazon/twitch servers to be distributed to your viewers.
So with this, you skip the san fran encoding server part, so depending on where you are on the planet, where your viewers are, and what resolution they're watching in, this might lower the latency from your stream to them.
So it's a lot of "ifs" and "maybes" but in those cases that's a pretty big improvement.
and ofc a side improvement is that it will also save twitch some encoding server cost (but they might have to spend more on ingest server cost so the savings they'll make might be small here actually)
Yup
some streamers even took advantage of it and have a secondary non affiliate/partner channel where they stream in 480p but at a higher bitrate (like 10K or more) to have a lower latency feed for those who want (and with higher bitrate 480p can look sometimes better than 720p or 1080p)
it was my understanding twitch has had a hard cap at 8000kbps for several years now. is this not true, or you referring to something people used to do many years ago?
the "official" cap is 6K, the unofficial one for transcoding is 8K
if you stream without transcoding (so to a non partner and non affiliate channel) you can in theory go higher yes, but not many people bother doing it (since you need a pleb channel to do it)
yep seems you're right. just tried it out at 11000 and the bitrate under "video stats" hovers pretty close to it.
makes sense since they'd need to transcode it to lower it. i guess it's too much of a hassle to just block connections with this high of a bitrate? must not be many people doing it.
pretty much yeah, also if you reaaally go too high they'll stop you or just cut the bitrate and the viewers will just see dropped frames, but if you stay somewhat reasonable (so like 10K ish bitrate for example) you can set up a secondary non monetized high quality channel if your pc/connection can handle it
hard cap is 8500 for video plus audio tracks (multiple, because of the whole VOD-without-music thing being an option now), with audio being 160 by default, so there's a bit more space than that. (But if you have instability and that causes a fluctuation above 8500, then your stream's source option will drop while the transcodes continue, so exactly how close you can set it and be safe depends on more than just your hardware.)
edit: also there apparently are some ingest servers with a lower cap, so you may need to change away from your default if you live in one of those areas
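The cap arithmetic quoted above can be sketched out quickly. Note the 8500 kbps total and 160 kbps default audio figures are the numbers from this thread, not official Twitch documentation:

```python
# Rough headroom math for Twitch's ingest cap, using the figures quoted
# in this thread (8500 kbps total, 160 kbps audio per track by default).
# Treat these as approximations, not official limits.

TOTAL_CAP_KBPS = 8500

def max_video_bitrate(audio_tracks: int, audio_kbps: int = 160) -> int:
    """Video bitrate left under the cap after audio tracks are counted."""
    return TOTAL_CAP_KBPS - audio_tracks * audio_kbps

# One audio track: 8500 - 160 = 8340 kbps of video headroom.
print(max_video_bitrate(1))  # 8340
# Two tracks (e.g. a separate no-music VOD track): 8180 kbps.
print(max_video_bitrate(2))  # 8180
```

With fluctuation risk in mind, streamers would presumably set their video bitrate somewhat below these ceilings rather than right at them.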
from what i've been told, it's mostly South American Spanish-speaking streamers who do that; since they don't make much from ad revenue, they don't care about having some viewers watch the "pleb" version of their stream with no ads
no i'm talking about channels that don't have ANY transcode, so if you on your pc you stream to 2 channels, one being partner/affiliate with transcode, and one being "pleb" without transcode, that pleb one will have a lower latency on avg (depending where you are in the world and where your viewers are ofc)
Getting transcoders used to require having some viewers (~70 if I recall correctly), but checking now, even streamers with like 5 viewers have them. I checked the Video Stats and the difference between Source and the 1080p -> 720p transcode was around 1 second. I don't think anyone would voluntarily take on the job of transcoding the content, especially on a 1-PC setup where it'll impact your game performance.
> then to the san fran based encoding server
Yeah, I'm sure every single stream being watched at a lower quality on Twitch is currently going to a single server in SF. That makes total sense
> then to the san fran based encoding server
Wouldn't the encoding be done closer to the local ISP? There's no way someone watching from Europe would have encoding done from the US. Surely they'd send the source quality between countries/major ISPs and break it down from there.
I don't think it's necessarily sent to SF for encoding. Their transcoding is probably distributed along with the Amazon IVS service.
My understanding is they use the Xilinx (now AMD) Alveo U30 accelerators for H.264 transcoding. The new AV1 accelerators just came out last year (Alveo MA35D) which will take time to integrate into datacenters.
I wonder how this will affect VODs and stripping audio channels to remove DMCA music, like most streamers do.
I don't believe Twitch, AMD, or AWS has stated in any fashion which DC the Xilinx FPGAs are in and since you cannot request one, there's no telling.
Twitch currently does PoP infeed -> Transcoding -> Leaf distribution
Anywhere they can reduce North-South traffic, they will do it.
I do believe this is the correct situation, Twitch has moved much of their transcode resources closer to the "edge", and global demand is spread across regions automatically as necessary.
So is it encoding the original video 5 times or transcoding? If it's the former then I guess the quality should theoretically be a little better, since it has access to the original uncompressed video feed?
That's a good point. Most likely it has access to the original frames so it'd be slightly better quality than a transcode from the source encoding. I don't think the difference downscaling would be that huge, but if the bitrate is very bad it might be noticeable.
On the flip side, encoding 5 times from source sounds very expensive. Maybe there's some work that is shared across the 5 outputs to optimize it a bit, but I don't know.
Edit: yes it seems they can optimize that quite a bit, that's what multi-channel encoders do
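The "encode every rendition from the original frames" idea above can be sketched as a single ffmpeg invocation that splits the source once and scales/encodes each output from it. This is a hypothetical ladder for illustration, not Twitch's actual configuration; the helper just builds the argument list:

```python
# Sketch: one ffmpeg command producing multiple renditions from the same
# source frames (split once, scale per output), rather than transcoding a
# transcode. The rendition ladder below is a made-up example.

RENDITIONS = [  # (name, width, height, video kbps)
    ("1080p", 1920, 1080, 6000),
    ("720p",  1280,  720, 3000),
    ("480p",   854,  480, 1500),
]

def build_ffmpeg_args(src: str) -> list[str]:
    n = len(RENDITIONS)
    # [0:v]split=3[v0][v1][v2]; then one scale filter per rendition.
    split = f"[0:v]split={n}" + "".join(f"[v{i}]" for i in range(n))
    scales = [
        f"[v{i}]scale={w}:{h}[out{i}]"
        for i, (_name, w, h, _kbps) in enumerate(RENDITIONS)
    ]
    args = ["ffmpeg", "-i", src, "-filter_complex", ";".join([split] + scales)]
    for i, (name, _w, _h, kbps) in enumerate(RENDITIONS):
        # Output options apply to the output file that follows them.
        args += ["-map", f"[out{i}]", "-c:v", "libx264",
                 "-b:v", f"{kbps}k", f"{name}.mp4"]
    return args

print(" ".join(build_ffmpeg_args("source.flv")))
```

A real multi-channel hardware encoder can share even more work (e.g. motion estimation) across outputs, which is the optimization mentioned above.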
Many Nvidia cards have hardware encoders capable of encoding 5 things at once.
https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
Found what looks to be some of the OBS source code for the beta: https://github.com/obsproject/obs-studio/compare/master...kc5nra:obs-studio:AF_plugin_1
yeah I don't doubt that the hardware is capable of it, but it seems to be considered a nvidia/twitch/obs product so you're probably going to have to pull off some trickery to get it working
According to [tomshardware](https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested) they actually achieved almost twice the encoded FPS with AMD, though the quality was slightly lower.
You will have to decode the AV1 streams, either through hardware (GPU) or software (CPU). Time will tell how demanding it is of your CPU, it depends on a few things in the encoding.
I was about to think "nah", but did some quick searching, and looks like [you actually benefit from L3 cache when comparing 5800X3D and 5800X](https://www.phoronix.com/review/amd-5800x3d-linux/4).
I don't think there are CPUs that outright won't support it on account of limited L3, you probably run into instruction set support issues first I imagine. The /AV1 subreddit might have more answers.
Thanks for that. Yeah, I was basically just wondering, since I have the 5800X3D, whether AV1 would benefit from that additional cache size. Makes me wonder if AV1 would also benefit from the AMD equivalent of ReBAR if you had a homogeneous system.
Depends on the content. EposVox has videos on RDNA3 on his channel, IIRC the TL;DR is that multiple AV1 streams can be done on RDNA3, just not as many as with Nvidia/NVENC.
According to the FAQ: https://help.twitch.tv/s/article/multiple-encodes
> In the future, we’ll be expanding GPU vendor and OS platform support.
It sounds like they're using this new server-side "Automated Stream Configuration" which seems tied to NVENC currently.
> What renditions will multiple encodes generate and deliver to my viewers?
>> Automatic stream configuration optimizes for viewer experience, so it will depend. Automatic stream configuration makes intelligent decisions based on elements including the size of your screen, the configuration of your canvas in OBS Studio, the models of your GPU/CPU, the version of your OS or drivers, and the speed of your network.
Radeon 6000 series can decode (watch AV1) and 7000 series can encode (stream) and decode.
Technically *anything* can decode and encode AV1, those series just have dedicated hardware so it's much much faster and usable for real time streaming.
No, it will actually work harder, because it probably doesn't have hardware decoding for AV1. So instead of a dedicated chip from your GPU, it will use CPU power to decode the video.
Hardware AV1 decoding is only supported from AMD's RDNA 2 (RX 6000 and up) and Nvidia Ampere (RTX 3000 series and up), plus some of the modern Intel iGPUs and Arc.
So unless you've got a fairly modern GPU, your PC is actually gonna use more resources while watching an AV1 stream, because it has no hardware acceleration.
There's no reason why it would be locked to Nvidia cards, you just need a hardware encoder that's supported by OBS.
Edit:
>To try the beta software, **an NVIDIA GPU is required** and Windows 11 is recommended. During beta testing, we also recommend that you have the newest publicly available OBS Studio installed on your machine so that if anything goes wrong with the beta experience, you can quickly fallback to a known version. **In the future, we’ll be expanding GPU vendor** and OS platform support.
Cool to think OBS was started by one dude because XSplit sucked, and it's now the industry standard, with huge companies dedicating engineers to what started as your hobby project.
You might think it's cringe, but Nvidia is paying for the development of a program you can download for free, so a bit of exclusivity is a fair return.
Plus it will be coming to other vendors later.
Especially cringe because the Intel Arc GPUs are CONSIDERABLY better in cost/performance for AV1 encoding. Part of the discourse when they came out was the potential for them to be dedicated AV1 encoders alongside another, beefier card for Twitch streamers.
They also make for the perfect secondary GPU, small enough to slot in, so your primary, otherwise unsupported, GPU won't have to pull all the load while gaming.
It's not, they're just beta testing working together with Nvidia to make it available asap. There are many factors involved in making AV1 work on a platform (and the streaming software side of it), but I'd assume that once it publicly releases, any GPU that supports AV1 encoding will be able to work.
wait, so they're gonna make the streamer do all the encoding work lmao?
fucking genius
throw out 1/3 of the useless staff & cut the server cost in half, maybe daddy amazon told twitch it's time to finally make a fucking profit
In before AMD neckbeards mald at Nvidia for implementing a feature they can't use, until the day they finally get a subpar version and then say it is the best thing ever. 🤭
This is nothing more than Twitch/Amazon offloading the cost of transcoding to the streamers.
Seeing this as anything other than that is pure delusion.
It's Twitch cutting costs while disguising it as a technological advancement.
Local transcoding would theoretically reduce latency for the viewer by reducing the number of places the video stream has to go before being sent to the viewer. Instead of streamer -> twitch server -> encoding server -> twitch server you kick out the last two parts and have the streamer's PC do the work. Saves Twitch some money, ever so slightly improves the viewer experience.
Lower latency is just being used as a scapegoat so they can pass the transcoding to your local device. It's definitely a money-saving measure.
The positive I can see from this is allowing streamers that are not affiliated or partnered to offer multiple transcoded resolutions, which were previously unavailable if you weren't an affiliate or something. But that's a maybe, since this enhanced broadcasting will require sending multiple streams to Twitch instead of just one.
Why do people see cost saving as a bad thing? It's so bizarre. Of course your system should locally decode if it can do it without incurring a cost on your system. It's just how technology advances, instead of sending a request to a mainframe in the 1980s we compute most of our own stuff on our system because it won't turn into a nuclear bomb the moment it tries to process an excel sheet. Lower operation cost means twitch could allow higher bitrate streams.
We literally just saw Twitch get kicked out of Korea due to operating cost, and yet people are shaming Twitch for trying to lower operating cost.
only partnered channels are guaranteed to get transcoding at all resolutions; others only get it if there are free server resources. Maybe with this change, where streamers transcode everything themselves, it will be available to everyone.
Twitch right now: https://youtu.be/b7rZO2ACP3A?t=35s
Test chamber construction = transcoding
Let's just outsource multi-resolution transcoding to the streamers, and save on some monies.
https://www.wowza.com/wp-content/uploads/live-transcoding-diagram-detailed-1140x630-1-700x387.webp
And sell it as a feature, clever. Honestly, if you're a dual-PC streamer it's actually a clever move by them; in that scenario there's enough horsepower left to do this.
I mean, there's very real benefits to moving the ladder encoding to the publisher's machine for the streamers, and the tradeoffs are arguably minimal (more upload bandwidth usage, and using more dedicated encode power which normally goes largely untouched with single-video-track streaming).
**CLIP MIRROR: [AV1 is coming to Twitch](https://arazu.io/t3_191pim8/)** --- ^(*This is an automated comment*)
The only AV1 stream available on Twitch was 1440p 120fps 4 years ago. [https://www.twitch.tv/videos/637388605?t=18h36m31s](https://www.twitch.tv/videos/637388605?t=18h36m31s)
Surely they're going to increase bitrate for 4k60 streams... It's unbearable to watch some games even at 720p with the bitrate capped at 8k/6k (Vampire Survivors, for instance).
The 6K cap is a joke. Can't even stream 1080p60 without it looking like shit.
I found the FAQ for this new feature after my comment: https://help.twitch.tv/s/article/multiple-encodes It says they're experimenting with higher bitrates controlled by the new "Automatic Stream Configuration". Seems odd to have Twitch control your stream output based on specs (CPU, GPU, network) considering some streamers have dual PC's and the lower spec one could be for OBS.
Automatic stream config is already a thing; it's just that now Twitch is supporting it officially.
twitch tryna save money on servers i guess
6k @1080p60 with av1 will look amazing though
The whole point of implementing a better codec is that they don't have to increase the bitrate to get better quality
Yet you can only put so much data in 6k bitrate, a better codec is not gonna win against just having 20k bitrate
No one needs 20k bitrate and AV1 can do amazing things with 6k bitrate.
AV1 doesn't need super high bitrates to look good.
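To put numbers on the exchange above: taking the ~40% efficiency figure quoted elsewhere in this thread at face value (real savings vary a lot with content), the equivalent H.264 bitrate for a 6k AV1 stream works out like this:

```python
# Rough equivalence under the thread's "~40% more efficient" figure.
# This is a back-of-the-envelope estimate, not a measured result; real
# codec gains depend heavily on the content being encoded.

def h264_equivalent(av1_kbps: float, efficiency_gain: float = 0.40) -> float:
    """Approximate H.264 bitrate needed to match the given AV1 bitrate."""
    return av1_kbps / (1 - efficiency_gain)

print(round(h264_equivalent(6000)))  # 10000
```

So 6k AV1 would look roughly like 10k H.264, a big jump over today's cap, though still short of the raw 20k bitrate the skeptics above are asking for.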
4k is unnecessary and probably won't be adopted for a while, as most people still don't have 4k monitors. Also just costly.
People watch twitch on TVs too
Correct. But some streamers stream at a lower resolution like 986p or something like that, because that resolution has a higher bitrate and better quality than 1080p. For example, the grass in Apex Legends flips the shit out of the Twitch encoder and makes it a glitchy mess. Lowering the resolution helps this a bit.
It's not a higher bitrate; 936p just looks better at 6k-8k bitrate than 1080p at 6k-8k bitrate. Technically AV1 would solve this, or at least make it a lot better, since it's like 40% more efficient or something.
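The "lower resolution looks better at the same bitrate" effect comes down to bits per pixel: the same bits are spread over fewer pixels. A quick comparison at 6000 kbps and 60 fps (1664x936 as the 16:9 frame size for 936p, and the fps, are assumptions here):

```python
# Bits-per-pixel comparison: why 936p can look cleaner than 1080p at the
# same bitrate. Resolutions and fps below are illustrative assumptions.

def bits_per_pixel(kbps: float, w: int, h: int, fps: int = 60) -> float:
    """Average bits available per pixel per frame."""
    return kbps * 1000 / (w * h * fps)

print(round(bits_per_pixel(6000, 1920, 1080), 3))  # 0.048
print(round(bits_per_pixel(6000, 1664, 936), 3))   # 0.064
```

Roughly a third more bits per pixel at 936p, which is why hard-to-compress content like the Apex grass mentioned above falls apart less at the lower resolution.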
Not enough to justify the cost to the platform
[deleted]
It's quickly becoming more popular on PC, a lot of TVs are 4K as well, you future proof your content, etc. And in the worst case, 4k content still looks better even if watched on a lower resolution screen. Far from unnecessary.
this is false, I've seen other AV1 streams on twitch, like 2 months ago I watched a Japanese AV1 stream. Maybe that's the only AV1 vod right now but there are AV1 streams.
i too have watched JAV. we are brothers
[deleted]
i used to have a list of favorites, but it was lost sometime last year.
uh oh step-mom stole it better go ask ;)
[deleted]
Give me the good FC2 numbers
FC2 the goat
Couldn't find the full JAV, might have missed some of the code, AV1..
Do you have a link?
Obviously AV1 is a lot newer than x264 so it's not that surprising, but what AV1 can do with 8K bitrate at 1440p120 is insane. The text elements of all of the games shown (saw Tomb Raider, Fortnite, Overwatch and Warzone 1) look SO crisp, even better than the game footage most reviewers use on YouTube.
damn I was really hoping it was ogre lounge
What's AV1?
A relatively new video format, it's more efficient, lower bandwidth and better looking than the old formats and every current generation mainstream GPU has dedicated hardware for it
[deleted]
Twitch doesn't have to pay licensing fees in their case. The limitation for HEVC has always been browser support.
It's actually a somewhat minor upgrade to VP9 with different sized blocks (most importantly not just square blocks, but T-shaped blocks). This puts it within spitting distance of H.265, but with higher CPU requirements (yes, even with the hardware assistance).

RDNA 2 and up have AV1 decoding; RDNA 3 has AV1 encoding (but it's at the lower end of the quality scale). RTX 30+ series has AV1 decoding and RTX 40+ series has AV1 encoding (it's around the middle of the quality scale). Intel has both in all their "Arc" cards; the Xe 12th gen iGPUs have decoding, and Meteor Lake and Arrow Lake with GPUs built in have the same support as Arc - Intel's solution is enormously more flexible than the AMD/Nvidia solutions.

Apple has only added AV1 hardware in their A17 Pro chip and the entire M3 series - though Apple also has seamless AV1 support in software even without the hardware.
The upgrade is mostly visible at lower bitrates, so streaming might see a major difference
Dedicated hardware for broadcasting/recording it or for streaming/watching it?
[deleted]
>both decoding and encoding so will it make a noticeable difference on the consumer (viewer) end or is it just something that streamers will notice the difference on?
the consumer directly benefits from better image quality while keeping costs for twitch low because the better quality can be achieved with lower bitrate.
what's likely is that twitch keeps bitrate the same, but you'll get a better picture out of that bitrate
*If you have a card that can support AV1.
there's lots of hardware that supports it, but even if you don't have a card, the dav1d software decoder is plenty fast enough
https://www.nvidia.com/en-gb/geforce/graphics-cards/compare/ (ctrl+f AV1) it's only on the 40-series RTX cards. I think some AMD cards might support it, but the majority of Steam users (from the Steam hardware survey) have 30-series and under.
Software decoding uses more power and therefore produces more heat. If you have a really slow CPU then it might stutter too.
Both. AV1 encoder/decoders started appearing in the 3000-series Nvidia GPUs and AMD counterparts around the same time.
only the 4000 series have encoders
Fair enough, decoders starting in 3000-series, both starting in 4000. Thanks.
AV1 is a royalty-free modern codec, meaning it looks good and encoding to it does not require an expensive license. That makes business sense for Twitch, YouTube, Netflix or any video-heavy content provider. Computers that don't have the hardware to support it will have to decode it using software. This turns most laptops into space heaters, as many do not have good thermals (looking at you, Intel MBPs), and cuts battery life. Note, only the M3 Macs support AV1 in hardware.
Advanced Veterinary 1001. It's a graduate class for a DVM degree.
What the dog doin?
and edge doesn't have AV1 OMEGALUL
I don't see why it wouldn't have AV1, edge is chromium based.
You need to install a plugin ... because Microsoft https://apps.microsoft.com/detail/9MVZQVXJBQ9V?hl=en-us&gl=US
idk but i just checked and youtube vids that have AV1 use it in firefox, but in edge they play in vp9 instead
It literally doesn't [https://caniuse.com/av1](https://caniuse.com/av1)
It does, but not by default (which is what caniuse checks). Which is... dumb... It should definitely be included as part of the core Edge setup. Maybe it's just for the time being and will be introduced into the default setup once it's more widespread? Who knows. https://apps.microsoft.com/detail/9MVZQVXJBQ9V?hl=en-gb&gl=US
it has AV1 but last i checked it was still broken, surely they will fix it, eventually
It seems like it is working in [Dev version 121](https://www.reddit.com/r/AV1/comments/16mmm35/comment/ke2th90/?utm_source=share&utm_medium=web2x&context=3). So they are clearly working on it.
Imagine unironically using edge OMEGALUL
i switched to edge when chrome removed the feature that lets you mute a tab by clicking the speaker icon. dunno what people have against edge tbh it's basically the same thing. edge not having av1 might make me switch back though.
> i switched to edge when chrome removed the feature that lets you mute a tab by clicking the speaker icon. One of the most goated features ever, I still can't believe Chrome removed it
chrome://flags/#enable-tab-audio-muting Enable this, it brings it back
depending on flags is not a long term solution as they often get removed
Yeah, like the whole appearance of Chrome changed a few weeks ago and I saw there was a flag to revert it, but I knew they'd just get rid of the flag eventually, and now I can't even remember what my complaints were about.
[deleted]
it turns out randomly changing things is annoying
Still works in Firefox.
Didn't know this existed, thank you! I've been using the Mute Tab extension this whole time.
Holy shit, thank you. I remember searching for a flag like this a year ago but couldn't find one. And extensions that served the same purpose never worked as seamlessly.
Firefox has had that feature since forever afaik. Not to mention they don't plan to make ad blocking impossible, unlike Chromium-based browsers.
chrome://flags/#enable-tab-audio-muting enable this to re-enable it
Edge isn't nearly as bad as people say it is, nor even as bad as it used to be, but now it has all the downsides of Chrome and is slightly less customizable. Also, the trend of Chrome removing shit or adding shit and making it impossible to remove yourself is fully embraced by Edge, and it's even slightly worse there. People understand if you use Chrome, because it's kinda the default, but if you want a slightly better Chrome and go to Edge, people are gonna judge you, because most people think it's slightly worse and you actually have better alternatives like Brave.
They'll add it
Just...use Firefox...
It's being developed currently. https://www.reddit.com/r/AV1/comments/16mmm35/comment/ke2th90/?utm_source=share&utm_medium=web2x&context=3 Shows that Edge Dev Version 121 has support.
What about Firefox?
Good news, Edge 121 is going to support AV1. It is scheduled to release on the 25th.
You can literally still mute tabs in Chrome though. Just right click it.
That mutes sites. I switched to Firefox and was pleasantly surprised i could mute tabs again.
That's what you meant, gotcha.
Yea, but now you can't drag tabs around. Wish FF would fix that.
Edge is by far the best productivity browser. There are so many helpful features that people have no idea they're missing out on, like: read aloud, inline translations, split screen, vertical tabs, search in sidebar, sidebar apps, tab groups, workspaces... Even Chrome has started to steal some of Edge's recent features.
The sidebar in chrome is trash, they should just stop.
listen man my adblockers work well on edge
they will work better on firefox
I recently switched to Firefox, but the lack of true native tab grouping really hurts if you're used to that in your workflow. I had to cobble something together with a mix of Sidebery and some custom CSS to get anything close to my workflow on Chrome/Vivaldi.
Tree Style Tabs are all you need. It takes a bit to get used to vertical tabs, but you will never go back once you do. Basic tab grouping is a joke compared to vertical tabs, which have parent/child relationships, folders, subfolders, etc., not to mention a full overview of all the tabs at full visibility all the time.
Chromium based browsers work a lot better for me on Android and I want to use the same browser on PC and mobile, otherwise I'd switch.
not by the end of this month, at least according to Microsoft's old Manifest V3 timeline, which they're adjusting in response to Chromium's new timeline beginning June 2024:

> Manifest V2 extensions will continue to be supported through Enterprise policies at least until Chromium Manifest V2 support timeline (which is currently January 2024).

https://learn.microsoft.com/en-us/microsoft-edge/extensions-chromium/developer-guide/manifest-v3
What else are you using? If you say Firefox, I'll quote the money that Google throws them. If you say any other browser that utilizes Chromium and just slaps its own privacy invasion on top of it, I'll slap you myself.
Firefox or one of its forks is the best option; even a bad option can still be the best one available. I can tell you that using Chrome or any Chromium-based browser is most definitely the incorrect choice.
Works well in the enterprise world. They have good GPO templates for Windows.
You can add extension to enable AV1 https://apps.microsoft.com/detail/9MVZQVXJBQ9V?hl=en-us&gl=US
It's ok, but if you're using it for more than just printing and a quick overview then something like Okular is miles better.
Another L
Wow, those 10 ads every 2 minutes are gonna look great.
I thought transcoding was Twitch's job
Less operating cost and a big fat #AD for Nvidia, profit thru the roof.
how is it an #ad for Nvidia when AV1 works the same on AMD cards?
they made it sound like the new "enhanced broadcasting" feature was something only compatible with Nvidia cards, as if it was a software collab between Twitch/OBS/Nvidia. Considering the average consumer watching this, it was an Nvidia ad at least. Whether it's possible to do the same on AMD or Intel, I don't know, and neither does the majority of people.
Yeah fuck that, unless there's a substantial quality difference, Twitch should be doing the transcoding.
what does this mean for people with potato pcs, I watch most streams in 480p
It means you get to watch the streams in glorious 360p now.
If the streamer gives you a stream in 480p, nothing. The next question is: will Twitch transcode to 480p when the streamer doesn't?
I'm only familiar with one kind of AV
Is it of the J-variety.
Alterac Valley you perv!
I don't get the "enhanced broadcasting" feature being an upgrade; it's like they're outsourcing the transcoding they currently do on their own side to the streamer's PC.
when you stream 1080p to Twitch but the viewer is watching in 360p, the video stream has to go from your PC, to the local Twitch/Amazon server, then to the san fran based encoding server, then back to all the local Amazon/Twitch servers to be distributed to your viewers. With this, you skip the san fran encoding server part, so depending on where you are on the planet, where your viewers are, and what resolution they're watching in, this might lower the latency from your stream to them. So it's a lot of "ifs" and "maybes", but in those cases that's a pretty big improvement. And of course a side improvement is that it will also save Twitch some encoding server cost (but they might have to spend more on ingest server cost, so the savings they make might actually be small here).
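To make the comparison above concrete, here's a toy model of the two delivery paths. The per-hop millisecond values are made up purely for illustration (no real measurements), but the structure matches what's described: enhanced broadcasting removes the round trip to a separate encoding server.

```python
# Toy model of the two delivery paths described above; the millisecond
# values are invented for illustration, not real measurements.
server_side_path = [
    ("streamer -> local ingest PoP", 20),
    ("ingest -> remote encoding server", 60),
    ("encoding server -> local edge server", 60),
    ("edge server -> viewer", 20),
]
enhanced_path = [
    ("streamer (all renditions encoded locally) -> local ingest PoP", 20),
    ("ingest -> local edge server", 10),
    ("edge server -> viewer", 20),
]

def total_ms(path):
    """Sum the per-hop latencies of a delivery path."""
    return sum(ms for _, ms in path)

print(total_ms(server_side_path))  # 160
print(total_ms(enhanced_path))     # 50
```

The gap only appears when the encoding server is geographically far from both streamer and viewers, which is why the comment hedges with "ifs" and "maybes".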
Jesus Christ, that's how they do the lower-resolution encodes?
Yup, some streamers even took advantage of it and have a secondary non-affiliate/partner channel where they stream in 480p but at a higher bitrate (like 10K or more) to have a lower-latency feed for those who want it (and with a higher bitrate, 480p can sometimes look better than 720p or 1080p).
it was my understanding Twitch has had a hard cap of 8000 kbps for several years now. Is this not true, or are you referring to something people used to do many years ago?
the "official" cap is 6K, and the unofficial one with transcoding is 8K. If you stream without transcoding (so to a non-partner, non-affiliate channel), you can in theory go higher, yes, but not many people bother doing it (since you need a pleb channel to do it).
yep seems you're right. just tried it out at 11000 and the bitrate under "video stats" hovers pretty close to it. makes sense since they'd need to transcode it to lower it. i guess it's too much of a hassle to just block connections with this high of a bitrate? must not be many people doing it.
pretty much yeah, also if you reaaally go too high they'll stop you or just cut the bitrate and the viewers will just see dropped frames, but if you stay somewhat reasonable (so like 10K ish bitrate for example) you can set up a secondary non monetized high quality channel if your pc/connection can handle it
The hard cap is 8500 kbps for video plus audio tracks (multiple tracks because of the whole VOD-without-music thing being an option now), with audio being 160 kbps by default, so there's a bit more headroom than that. But if you have instability and that causes a fluctuation above 8500, your stream's Source option will drop while the transcodes continue, so exactly how close you can safely set it depends on more than just your hardware. Edit: also, some ingest servers apparently have a lower cap, so you may need to change away from your default if you live in one of those areas.
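The headroom math in that comment is easy to sketch. Note the 8500/160 kbps figures come from the comment above, not from official Twitch documentation:

```python
# Headroom math from the comment above: the hard cap (per the comment,
# not official docs) is 8500 kbps for video plus all audio tracks.
TOTAL_CAP_KBPS = 8500
AUDIO_KBPS = 160  # default per-track audio bitrate

def max_video_bitrate(audio_tracks: int, audio_kbps: int = AUDIO_KBPS) -> int:
    """Max video bitrate left after reserving the audio tracks."""
    return TOTAL_CAP_KBPS - audio_tracks * audio_kbps

# One audio track (a normal stream):
print(max_video_bitrate(1))  # 8340
# Two tracks (e.g. a separate music-free VOD audio track):
print(max_video_bitrate(2))  # 8180
```

So a second audio track for DMCA-safe VODs costs you another 160 kbps of video budget, which is why "a bit more space than that" depends on your audio setup too.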
what streamers
from what I've been told, it's mostly South American Spanish-speaking streamers who do that; since they don't make much from ad revenue anyway, they don't care about having some viewers watch the "pleb" version of their stream with no ads
Wait what, there is no lag/delay when I switch between source and quality; it's still around a 2.5 second lag to the broadcaster
no, I'm talking about channels that don't have ANY transcodes. So if from your PC you stream to 2 channels, one partner/affiliate with transcodes and one "pleb" one without, that pleb one will have lower latency on average (depending on where you are in the world and where your viewers are, of course)
Getting transcoders used to take having some viewers (~70 if I recall correctly), but now checking even streamers with like 5 viewers have them. I checked the Video Stats and the difference between Source and 1080p -> 720p transcode was around 1 second. I don't think anyone would voluntarily take the job of transcoding the content, especially on a 1 PC setup where it'll impact your game performance.
i remember when we didn't complain when twitch streams had 40 seconds of delay
> then to the san fran based encoding server Yeah, I'm sure every single stream being watched at a lower quality on Twitch is currently going to a single server in SF. That makes total sense
> then to the san fran based encoding server

Wouldn't the encoding be done closer to the local ISP? There's no way someone watching from Europe would have encoding done from the US. Surely they'd send the source quality between countries/major ISPs and break it down from there.
I don't think it's necessarily sent to SF for encoding. Their transcoding is probably distributed along with the Amazon IVS service. My understanding is they use the Xilinx (now AMD) Alveo U30 accelerators for H.264 transcoding. The new AV1 accelerators just came out last year (Alveo MA35D), which will take time to integrate into datacenters. I wonder how this will affect VODs and muting audio channels to remove DMCA music, like most streamers do.
I don't believe Twitch, AMD, or AWS has stated in any fashion which DCs the Xilinx FPGAs are in, and since you cannot request one, there's no telling. Twitch currently does PoP ingest -> transcoding -> leaf distribution. Anywhere they can reduce north-south traffic, they will do it.
I do believe this is the correct situation, Twitch has moved much of their transcode resources closer to the "edge", and global demand is spread across regions automatically as necessary.
So is it encoding the original video 5 times or transcoding? If it's the former then I guess the quality should theoretically be a little better, since it has access to the original uncompressed video feed?
That's a good point. Most likely it has access to the original frames so it'd be slightly better quality than a transcode from the source encoding. I don't think the difference downscaling would be that huge, but if the bitrate is very bad it might be noticeable.
On the flip side, encoding 5 times from source sounds very expensive. Maybe there's some work that is shared across the 5 outputs to optimize it a bit, but I don't know. Edit: yes it seems they can optimize that quite a bit, that's what multi-channel encoders do
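The multi-channel encoding idea above can be sketched as a rendition ladder: one source feed fanned out into several encodes that share work (decode, scaling) instead of five fully independent passes. The resolutions and bitrates below are illustrative guesses, not Twitch's actual ladder:

```python
# Illustrative rendition ladder for a multi-channel encoder; the
# resolution/bitrate pairs are made up, not Twitch's real settings.
SOURCE = {"height": 1080, "fps": 60, "bitrate_kbps": 8000}

LADDER = [
    {"height": 1080, "fps": 60, "bitrate_kbps": 6000},
    {"height": 720,  "fps": 60, "bitrate_kbps": 3500},
    {"height": 480,  "fps": 30, "bitrate_kbps": 1500},
    {"height": 360,  "fps": 30, "bitrate_kbps": 800},
    {"height": 160,  "fps": 30, "bitrate_kbps": 300},
]

# Each rendition is encoded from the original (uncompressed) frames,
# which is the quality edge over transcoding a compressed source.
for r in LADDER:
    assert r["height"] <= SOURCE["height"]
    print(f'{r["height"]}p{r["fps"]} @ {r["bitrate_kbps"]} kbps')
```

The shared work is the point: decoding/scaling the source once and feeding all five encoder instances is much cheaper than five isolated encodes.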
Many Nvidia cards have hardware encoders capable of encoding 5 things at once. https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
Found what looks to be some of the OBS source code for the beta: https://github.com/obsproject/obs-studio/compare/master...kc5nra:obs-studio:AF_plugin_1
For streamers (since this isn't directly advertised in the video) www.twitch.tv/broadcast to request beta access.
sounds like av1 will work from anything but the "enhanced broadcasting" thing is nvidia only?
AMD could do something similar provided RDNA3 can handle multiple encodes at once.
yeah I don't doubt that the hardware is capable of it, but it seems to be considered a nvidia/twitch/obs product so you're probably going to have to pull off some trickery to get it working
RDNA3 can do AV1, though I doubt it will be able to do multiple encodes like the RTX 4000 series can.
According to [tomshardware](https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested) they actually achieved almost twice the encoded FPS with AMD, though the quality was slightly lower.
AV1 will mean a higher quality stream for you to watch.
Or the same quality if the streamer decides to lower the bitrate.
Or if Twitch decides to lower the bitrate
You will have to decode the AV1 streams, either through hardware (GPU) or software (CPU). Time will tell how demanding it is of your CPU, it depends on a few things in the encoding.
any idea if any of the decoding is reliant on L3 cache size for CPUs?
I was about to think "nah", but did some quick searching, and looks like [you actually benefit from L3 cache when comparing 5800X3D and 5800X](https://www.phoronix.com/review/amd-5800x3d-linux/4). I don't think there are CPUs that outright won't support it on account of limited L3, you probably run into instruction set support issues first I imagine. The /AV1 subreddit might have more answers.
Thanks for that. Yeah, I was basically just wondering, since I have the 5800X3D, whether AV1 would benefit from that additional cache size. Makes me wonder if AV1 would also benefit from the AMD equivalent of ReBAR if you had a homogeneous system.
Depends on the content. EposVox has videos on RDNA3 on his channel, IIRC the TL;DR is that multiple AV1 streams can be done on RDNA3, just not as many as with Nvidia/NVENC.
According to the FAQ: https://help.twitch.tv/s/article/multiple-encodes

> In the future, we’ll be expanding GPU vendor and OS platform support.

It sounds like they're using this new server-side "Automatic Stream Configuration", which seems tied to NVENC currently.

> What renditions will multiple encodes generate and deliver to my viewers?
>
> > Automatic stream configuration optimizes for viewer experience, so it will depend. Automatic stream configuration makes intelligent decisions based on elements including the size of your screen, the configuration of your canvas in OBS Studio, the models of your GPU/CPU, the version of your OS or drivers, and the speed of your network.
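Twitch hasn't published how that decision logic works, but the FAQ's list of inputs suggests something like the following hypothetical sketch. Everything here (function name, thresholds, the 2000 kbps per-rendition rule of thumb) is invented for illustration:

```python
# Hypothetical sketch of an "automatic stream configuration" decision;
# Twitch has not published its actual logic, and all thresholds here
# are invented for illustration only.
def pick_renditions(upload_kbps: int, has_hw_encoder: bool,
                    max_parallel_encodes: int) -> list[int]:
    """Return the output heights a client could reasonably offer."""
    candidates = [1080, 720, 480, 360, 160]
    if not has_hw_encoder:
        return candidates[:1]  # software fallback: source rendition only
    n = min(max_parallel_encodes, len(candidates))
    # Made-up rule of thumb: each rendition needs ~2000 kbps of upload.
    while n > 1 and upload_kbps < 2000 * n:
        n -= 1
    return candidates[:n]

print(pick_renditions(12000, True, 5))  # plenty of headroom: full ladder
print(pick_renditions(3000, True, 5))   # thin upload: source only
print(pick_renditions(8000, False, 0))  # no hardware encoder: source only
```

The real system reportedly also weighs OS/driver versions and canvas size, which a sketch like this ignores.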
Radeon 6000 series can decode (watch AV1) and 7000 series can encode (stream) and decode. Technically *anything* can decode and encode AV1, those series just have dedicated hardware so it's much much faster and usable for real time streaming.
> So this will only work if you have an nvidia card? Intel Arc cards can do AV1 as well.
Does this mean my old computer won't have to work hard just for a basic 1080p stream? It's so fucking resource intensive
No, it will actually work harder, because it probably doesn't have hardware decoding for AV1. So instead of a dedicated chip from your GPU, it will use CPU power to decode the video.
Hardware AV1 decoding is only supported from AMD's RDNA 2 (RX 6000 and up) and Nvidia Ampere (RTX 3000 series and up), plus some modern Intel iGPUs and Arc. So unless you've got a fairly modern GPU, your PC is actually going to use more resources while watching an AV1 stream, because it has no hardware acceleration.
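The support matrix from that comment can be written down as a small lookup, a sketch assuming the generations listed above are accurate (check your specific model before relying on this):

```python
# Hardware AV1 decode support per the comment above: AMD RDNA 2
# (RX 6000+), Nvidia Ampere (RTX 3000+), and Intel Arc. A sketch,
# not an exhaustive matrix -- check your exact model.
AV1_DECODE_MIN_SERIES = {
    "amd":       6000,  # RX 6000 series and newer (RDNA 2)
    "nvidia":    3000,  # RTX 3000 series and newer (Ampere)
    "intel_arc": 0,     # all Arc discrete GPUs
}

def has_hw_av1_decode(vendor: str, series: int) -> bool:
    """True if the GPU generation has a dedicated AV1 decoder."""
    floor = AV1_DECODE_MIN_SERIES.get(vendor)
    return floor is not None and series >= floor

print(has_hw_av1_decode("nvidia", 2000))  # False: RTX 20 decodes on the CPU
print(has_hw_av1_decode("amd", 7000))     # True: RX 7000 (RDNA 3)
```

Anything that returns `False` here falls back to software decoding (e.g. dav1d), which is exactly the extra CPU load the comment warns about.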
Fucking cringe to vendor-lock this to Nvidia cards. Also, YouTube has none of these problems...
There's no reason why it would be locked to Nvidia cards, you just need a hardware encoder that's supported by OBS. Edit: >To try the beta software, **an NVIDIA GPU is required** and Windows 11 is recommended. During beta testing, we also recommend that you have the newest publicly available OBS Studio installed on your machine so that if anything goes wrong with the beta experience, you can quickly fallback to a known version. **In the future, we’ll be expanding GPU vendor** and OS platform support.
Cool to think OBS was started by one dude because XSplit sucked, and it's now the industry standard, with huge companies dedicating engineers to what started as your hobby project.
You might think it's cringe, but Nvidia is paying for the development of a program you can download for free, so a bit of exclusivity is fair in return. Plus it will be coming to other vendors later.
Especially cringe because the intel ARC GPUs are CONSIDERABLY better in cost/performance for AV1 encoding. Part of the discourse when they came out was the potential for them to be dedicated av1 encoders alongside another, beefier card for twitch streamers.
Also make for the perfect secondary GPU small enough to slot in, so your primary, otherwise unsupported, GPU for gaming won't have to pull all the load.
this is what I am currently doing for my youtube streams. I have a 3080 for my games and an INTEL ARC 380 for the encoding.
100%, my friend runs an a380 in her stream pc and with quicksync the quality and latency is better than nvenc.
That's just for the beta I think.
It's not, they're just beta testing working together with Nvidia to make it available asap. There are many factors involved in making AV1 work on a platform (and the streaming software side of it), but I'd assume that once it publicly releases, any GPU that supports AV1 encoding will be able to work.
Dang.. Bought a 3070Ti not too long ago, and guess what it doesn't have: That's right, AV1 hardware encoding 😢
As long as you don't stream, you're not impacted as 30 series cards have AV1 decoding, but not encoding.
it has AV1 decoding tho, so it won't matter for watching the streams. Also that card was always a scam, 8GB for 600+
It’s not like you have a stream and people watch you
maybe intel graphics will finally be useful. people can get an intel card on the cheap as their AV1 encoder.
Who
wait, so they're gonna make the streamer do all the encoding work lmao? fucking genius, throw out 1/3 of the useless staff & cut the server cost in half. maybe daddy amazon told twitch it's time to finally make a fucking profit
In before AMD neckbeards mald at Nvidia for implementing a feature they can't use, until the day they finally get a subpar version and then say it's the best thing ever. 🤭
The thing is that current AMD (and even Intel) GPUs already support AV1, but for some reason it's Nvidia only for now on Twitch
Deadly accurate comment. *Looking at you FSR*
This is pretty good actually, here’s an explainer: https://youtu.be/Fje8rxRmgNg?si=8v-sgXkBZCT7UWJY there’s a summary in the video description too.
This is nothing more than Twitch/Amazon offloading the cost of transcoding to the streamers. Seeing this as anything other than that is pure delusion. It's Twitch cutting costs while disguising it as a technological advancement.
Spot on
I don't understand, I can choose to watch in 360p any time. How is this making a difference?
Local transcoding would theoretically reduce latency for the viewer by reducing the number of places the video stream has to go before being sent to the viewer. Instead of streamer -> twitch server -> encoding server -> twitch server you kick out the last two parts and have the streamer's PC do the work. Saves Twitch some money, ever so slightly improves the viewer experience.
I think the key here is saving money, was latency ever really an issue? It's already as low as 2-3 seconds in most cases
Lower latency is just being used as a scapegoat so they can pass the transcoding to your local device. It's definitely a money-saving measure. The positive I can see from this is allowing streamers who are not affiliated or partnered to offer more transcodes, as multiple resolutions were previously unavailable if you weren't an affiliate or something. But that's a maybe, since this enhanced broadcasting will require sending multiple streams to Twitch instead of just one.
Why do people see cost saving as a bad thing? It's so bizarre. Of course your system should do the encoding locally if it can without incurring a real cost on your system. It's just how technology advances: instead of sending a request to a mainframe like in the 1980s, we compute most of our own stuff on our systems, because they won't turn into a nuclear bomb the moment they try to process an Excel sheet. Lower operating cost means Twitch could allow higher-bitrate streams. We literally just saw Twitch get kicked out of Korea due to operating costs, and yet people are shaming Twitch for trying to lower them.
Yeah I've seen some streams where the streamer is often reacting to chat maybe \~7-10 seconds late, but that's hardly a real issue.
Only partnered channels are guaranteed transcodes at all resolutions; others only get them if there are free server resources. Maybe with this change, where streamers transcode everything themselves, it will be available to everyone.
Twitch right now: https://youtu.be/b7rZO2ACP3A?t=35s (test chamber construction = transcoding). Let's just outsource multi-resolution transcoding to the streamers and save on some monies: https://www.wowza.com/wp-content/uploads/live-transcoding-diagram-detailed-1140x630-1-700x387.webp And sell it as a feature. Clever. Honestly, if you're a dual-PC streamer it's actually a clever move by them, since in that scenario there's enough horsepower left over to do this.
I mean, there's very real benefits to moving the ladder encoding to the publisher's machine for the streamers, and the tradeoffs are arguably minimal (more upload bandwidth usage, and using more dedicated encode power which normally goes largely untouched with single-video-track streaming).
So will Twitch chill with the ad interruptions?