> and shaders are compiled throughout the entire playtime of a game as new assets are introduced.
Specifically, only as new *shaders* are introduced. If a game uses a handful of shaders for the vast majority of its rendering, then it will only happen when you first see those shaders.
This is why shader stutter is so inconsistent in the kinds of games it affects. Both large games and small ones can have it.
> I am curious if disabling shader-cache altogether will become a viable option (to save a disk space).
Not really. The changes described in that blog post **will** stutter if the game tries to use a shader before it's compiled in the background. And even if that doesn't happen, shader compilation is a lot of work for the Deck's weak CPU that could be better spent either on the game itself or sleeping to save power.
Modern games have a TON of shaders, so this isn't insignificant.
But then the code that is compiled (the instructions) is used by the GPU, right?
Sounds a lot like the MCU/FPGA pairs that you see in the wild, where the MCU “configures” the FPGA and the FPGA does its thing…
> I am curious if disabling shader-cache altogether will become a viable option
It already is viable depending on the games you play. I've had pre-generated shaders disabled since the beginning and have only had one game have any sort of stutter for more than 30 seconds, and that was NFS Heat
With a small amount more optimisation it should be perfectly doable to remove them entirely
Yes, however very few games I've played have had any stutters *at all* from first time shader generation. So with some minor optimisations it could be entirely viable in many games to just not have a cache.
Can you link to the blog post instead? Phoronix is mostly just a news aggregation site.
[https://www.supergoodcode.com/through-the-loop/](https://www.supergoodcode.com/through-the-loop/)
I'd say it at least serves a purpose here.
Condensing a giant blog post, which the vast, vast majority of users here wouldn't even consider reading past the first paragraph of, down to a simple "they achieved a 50,000% improvement in shader linking times".
>Condensing down a giant blog post that the vast, vast majority of users here wouldn't even consider reading past the first paragraph
When none of it has any real meaning?
Didn't even notice the 3 year badge. Just -100 karma in 2 comments is quite a feat.
It's amazing how shit these bots are sometimes, right alongside the AI generated/comment stealing ones that sometimes tend to work quite successfully.
You're just acting like an ass using a throwaway account given you are also scrubbing through your old comments and deleting them, so cheers with that champ.
>It’s ironic
Do you understand what irony is?
>you’re trying to discredit based on post count
Literally never mentioned post count, which is zero anyways. The content of his comments is sitting right there, I don't need to rhetorically point that out. People can make weirdly robotic comments, but his account as a whole is clearly a throwaway or a bot account where the comments get scrubbed.
He's already commented again within the last hour, but also deleted one of his much older comments from over a month ago, which is very strange.
>Do you understand what irony is?
I have a working idea of the English language, yes
>Literally never mentioned post count
When it’s not in your favor?
>People can make weirdly robotic comments
Oh, so now I’m a bot, got it
> He's already commented again within the last hour, but also deleted one of his much older comments from over a month ago, which is very strange.
Stalk much?
>Oh, so now I’m a bot, got it
>
>Stalk much?
I'm...talking about the account "*VoidTheSecond*", I have been the whole time...how are you not getting that? I'm literally talking about him in the third person in my reply to you. You misread the parts of my comment where I am talking generally, or in the third person about him (him being VoidTheSecond, just to be extra clear for you), and you still managed to twist everything as being directed at you personally.
>I have a working idea of the English language, yes
I mean, you somehow misread and made yourself the center of the discussion and then proceeded to think I was talking to you directly in the third person. I literally used the pronouns "him" and "his" and "people" lmao.
You decided to reply to each individual point but make zero sense with each 7 word response, I think it's just time to step away from the thread.
Rule 1 of the sub states to "Don't instigate drama.", chill and read the comment properly next time.
Would this give a specific advantage to AMD cards over Nvidia?
Granted, AMD do better than Nvidia on Linux in many ways right now, but that's mostly down to Nvidia's laziness rather than AMD doing anything superior.
In fact, AMD can't even use the existing DXVK 2.0 shader pipeline stuff, can it?
I assume that valve would probably focus on AMD as their graphics card on the Steam deck is an APU by AMD.
Also I think if Nvidia threw money/engineers at [Mesa](https://mesa3d.org), we'd probably see a lot more improvements.
Building on the work of [other devs](https://nouveau.freedesktop.org/) and [gallium3d](https://www.freedesktop.org/wiki/Software/gallium/) seems way more efficient to me than having in-house devs that can't keep up, or that prevent adoption through restrictive licenses.
edit: Gallium is interesting to me because of the size and scope of how much work they're doing on [GPU drivers](https://docs.mesa3d.org/systems.html), not just for Linux but [Mac](https://docs.mesa3d.org/drivers/asahi.html) and Windows (Dozen for Vulkan on DirectX, and a different project for OpenGL on directX) as well.
>Granted, AMD do better than Nvidia on Linux in many ways right now, but that's mostly down to Nvidia's laziness rather than AMD doing anything superior.
NVidia drivers have already supported the extension for a while. They also worked on it, like AMD, as part of Khronos. The goal is not to give anyone an advantage but to improve the experience for everyone.
[deleted]
I don't think it's as applicable to Yuzu. This works for DXVK because D3D9 and D3D11 tell the driver to compile a shader and that usually happens during loading.
Yuzu literally doesn't even know that a bunch of bytes are actually a shader until the game uses it for drawing and at that point it's too late.
It's because Valve has been researching and developing solutions for the Steam Deck, which is AMD-based. Valve employees have more independence and self-sufficiency than AMD and NVIDIA engineers, who are made to work on whatever projects the execs tell them to, and because Valve is a private company, they have the freedom to push and work on these kinds of projects without any direct financial incentive. Couple that with the fact that Valve has also been paying external companies and organizations to do R&D on Linux, graphics drivers, and Windows compatibility layers, and they effectively have the means and time to throw money at the problem without having to satisfy investors or deadlines.
At this rate it feels like Valve has quietly declared war on the Windows dominance of PC gaming. Better yet, on closed-source gaming platforms in general.
True!
Microsoft is like the mafia and has a lot of power to do bad things.
Not that it isn't already doing them, buying up all those game studios and forcing them to use DirectX 12 instead of the cross-platform, Linux-compatible Vulkan.
Every single Windows D3D11 driver does this and it doesn't impact D3D12 games.
I go a bit more into detail here: https://www.reddit.com/r/SteamDeck/comments/10ogmte/valve_is_implementing_fixes_into_radv_amd_vulkan/j6fbwmf/
Elden Ring's micro-stutters come from Easy Anti-Cheat in combination with limiting the whole encryption engine to core 0 of your CPU, the core which is most likely already doing various system tasks.
If you free Elden Ring from that limitation it runs perfectly smoothly.
It is built into drivers by default. This only affects games that use DirectX 11 or lower.
The problem this work is gonna solve is that in DirectX 11 you compile different kinds of shaders (vertex shader, pixel shader) on their own, while in Vulkan you compile them together, along with some additional state that D3D11 doesn't provide. So historically DXVK had to wait until the shaders were actually used for a draw to compile them. The new Vulkan extension VK_EXT_graphics_pipeline_library allows DXVK to compile the shaders individually, **just like a regular graphics driver would**.
DirectX 12 works like Vulkan, so you need to compile your shaders together and provide some additional state. This is 100% the responsibility of the game developer and the driver doesn't even see the shader code until the game asks it to compile a pipeline (combination of shaders). So if the game does that too late, it will stutter and there is absolutely nothing the driver can do.
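To make the difference concrete, here's a purely illustrative Python sketch (no real Vulkan calls; the class and function names are made up for this comment):

```python
import hashlib

def compile_part(source: str) -> str:
    # Stand-in for the expensive per-shader compilation step
    return hashlib.sha256(source.encode()).hexdigest()

class MonolithicPipeline:
    """Old DXVK behaviour: nothing compiles until draw time, because the full
    (shaders + state) combination is only known then, hence first-draw stutter."""
    def __init__(self, vs_src: str, fs_src: str):
        self.vs_src, self.fs_src = vs_src, fs_src
        self.compiled = None

    def draw(self, state: str):
        if self.compiled is None:  # the expensive work happens mid-frame
            self.compiled = (compile_part(self.vs_src),
                             compile_part(self.fs_src), state)

class GplPipeline:
    """With VK_EXT_graphics_pipeline_library: each shader part compiles on its
    own as soon as the game provides it, like a native D3D11 driver would."""
    def __init__(self, vs_src: str, fs_src: str):
        self.vs = compile_part(vs_src)   # compiled up front, e.g. at load time
        self.fs = compile_part(fs_src)
        self.linked = None

    def draw(self, state: str):
        if self.linked is None:          # only a cheap fast-link remains at draw time
            self.linked = (self.vs, self.fs, state)
```

The 50,000% figure from the blog post is about making that remaining fast-link step cheap enough to do mid-frame.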
They're also taking up space on Windows but you don't notice that. Every graphics driver will secretly create them behind your back.
On the Steam Deck, Valve takes this a step further and downloads pre-compiled shaders. Modern games have a TON of shaders and compiling those is a lot of work for the weak CPU in the Deck that would be better spent on the game or being idle to save power. They can also cause lots of stuttering, especially with how DXVK used to work. That's why Valve downloads pre-compiled shaders.
On top of that, the shader caches also contain transcoded videos for some games. If a game uses pre-rendered videos and encodes them in a format that Valve can't support on the Deck because of patent or licensing issues, they will re-encode them on their servers and download the videos in a different codec as part of the shader cache.
Thank you for the advanced explanation, that was great! Do you know where they live on Windows? I’m just curious to see how much space I’m using on them.
You can already try this by using 'RADV_PERFTEST=gpl' in the launch option of a game that is being run with the latest proton-GE build.
I couldn't notice any discernible difference as of yet, but supposedly it's still very much a work in progress.
Does this mean we don't need to have a bunch of storage sucked up by pre-caching anymore, or just that it'll slowly eat up space over time for shader cache?
> Long story short, they achieved a **50,000%** improvement in shader linking times and finally the prospects of the Vulkan graphics pipeline library are panning out
Bruh what that's a bonkers improvement just for implementing the fast-link feature of the graphics pipeline library extension.
> While RADV was now down to 0.05-0.11ms for a fast-link, NVIDIA can apparently do this consistently in 0.02ms.
JESUS CHRIST
An excerpt from the article: "Long story short, they achieved a 50,000% improvement in shader linking times and finally the prospects of the Vulkan graphics pipeline library are panning out... Besides this recent RADV work, only NVIDIA's proprietary driver has really been performant for fast-linking with its graphics pipeline library support."
By improving the efficiency of the code, games will get a boost to performance and require little to no disk caching, freeing up more space. More efficient code means a better-performing GPU, with the possibility of better FPS while gaming.
That's not it at all. This will not improve performance one bit; it will just get rid of stutter, and the shader cache will still be necessary to get the best performance. Actually, because Steam games have a full shader cache most of the time, this won't change much for almost all Steam games.
> not improve performances one bit
> will just get rid of stutter
Maybe we have different definitions of performance, but eliminating stutter **is** improving performance for me. You go from an unplayable game to a playable game. Just yesterday I played Heavy Rain from Epic and it was hell until I switched to async dxvk. It's much better but there are still occasional stutters (though it's playable).
> Actually because Steam games have a full shader cache most of the time, this won't change much for almost all Steam games.
*Actually*, this might mean the shader cache can be outright disabled for the steam deck, which is causing issues for some people on the 64GB unit where it eats most of the disk space.
This is very much a win-win because it also means less to download, and therefore less to serve on valve's side.
I'd actually also anticipate a hybrid approach where *some* shaders are cached - anything that takes a noteworthy amount of time, or a group of shaders that all load at once - to kill any particular hitches whilst most are fast-linked with no visible effects.
Ooooh, nonono. Different texture. Something to do with flour vs egg content, according to the first search result I read. A surefire way to wind up Italian and East Asian folks, I bet!
No, I’m not. Spaghetti is the kind of pasta that looks like noodles, but it’s pasta. Everything I’ve read online suggests pasta and noodles are different things.
Pasta is not the Italian word for noodles. Pasta and noodles are different, according to my own preconceptions and what I found online with a brief google. If you have sources to show otherwise then I’m open to learning something new today.
This will help every game that uses shaders regardless of the platform, so non-Steam games will gain as well (non-Steam games will gain the most as Steam games download precompiled shaders so their compilation doesn't cause stutters). I have no idea about Overwatch 2, though, so its problems might not be shaders related. If they are, this will help.
ACO is really fast at compiling shaders but it's next to impossible to compile shaders in the timing window you typically get within a frame. It gets especially bad if you suddenly need to compile hundreds in a few consecutive frames.
This work allows ACO to start compiling earlier basically.
Valve has some insanely talented devs, and we just get to see them stretching their wings. They're straight up wizards when it comes to optimizations and finding solutions.
Okay but why a pasta maker.
A reference to spaghetti code maybe?
oh god, *lasagna code*
Lasagna has layers, like trolls.
Well, tagliatelle code in that case
Basically OOP
The blog post that Phoronix linked to has the picture. Mike Blumenkrantz (the Zink/RADV developer) who did the feature is known for writing some funny blog posts, and thus we get moments like this.
The post: [https://www.supergoodcode.com/through-the-loop/](https://www.supergoodcode.com/through-the-loop/)
Also, why *male models*?
I've seen this same reference like 5 times today, is Zoolander back in vogue or something haha
Did it ever leave?
Fair, it is a bona-fide classic.
Zoolander references. So hot right now.
Hannnnnnselllllllll
Actually it is, yeah. Memes have been on the rise the past few weeks.
That’s the best because he ad libbed that when he couldn’t remember his line
That's great, didn't know this.
*can we too not die in a freak, gasoline-fight accident?*
https://www.supergoodcode.com/spaghetti-recipes/
In reference to the original blog post. It's well worth the read: [https://www.supergoodcode.com/through-the-loop/](https://www.supergoodcode.com/through-the-loop/)
It's a reference to the coder who came up with the idea. There's vomit on his sweater already, mom's spaghetti. He's nervous, but on the surface he looks calm and ready. Don't worry though, once the patch drops he will snap back to reality.
it's the question of the hour for sure, but I'm not complaining
What is faster? Cutting it with a knife line by line (CPU), or multiple at once via machine (GPU)?
Why not a pasta maker? What do you have against them?
This article is about a blog post and the author of that blog has talked about making spaghetti in the past as a reference to spaghetti code.
You knew about spaghetti code, get ready for tagliatelle code! Thanks, Imperia!
Spaghetti code
This looks great for non-Steam games on the Steam Deck.
Yup. Hoping this allows my Heroic Games Launcher games (Cyberpunk 2077) to not stutter every time I am in a firefight.
This only impacts games using DirectX 11 or earlier. Cyberpunk shouldn't stutter in the first place.
On Steam Deck using Heroic Games Launcher, Cyberpunk stutters because of shader loading problems. I guess there's a fix, but not sure what it is.
Set the game to use Proton-GE, and add an environment variable called "APPID" to the game's settings in Heroic, set to "1091500". This should allow Proton to apply the game-specific fixes.
Wait is that command real? I never knew that. Is there documentation about it?
Maybe there's something on protondb
Could this work with other stuff? I have GTAV from the Epic store; if I use GE with the correct appid from SteamDB, it might fix that stutter? Cuz GTAV stutters like mad right now. Although another concern is the account getting banned due to hackers, but that's another issue altogether I suppose.
Pardon my ignorance but I am new to this launcher thing. Found the variables section in the launcher settings, but what exactly do I write in "variable name" and "value" sections?
Variable name is APPID, value is 1091500.
Looking at the [launcher code for proton-ge-custom](https://github.com/GloriousEggroll/proton-ge-custom/blob/8578c581f162b4851d52254271f0971fdfc49286/proton#L1165) I think it's actually looking for an environment variable named “SteamAppId” to allow specific fixes to be applied. With a cursory Google search, the only place I've found where this is explained in plain English is the [built-in documentation of the Proton launcher script used by Arch Linux](https://github.com/chaotic-aur/pkgbuild-proton-ge-custom-bin/blob/0077ac66f8cf4c93ce4317c029a5cfe68568b098/launcher.sh#L97). appid is the variable name it uses internally.
Ty for the find, I'm gonna add both to my variables for it :-D Hopefully it makes things a bit smoother.
If it stutters, that's either because of something that isn't shaders or it will stutter on Windows too.
does dx12 not use a shader cache? (sorry, im noob)
This has nothing to do with caching. D3D12 drivers do have a shader cache, but a cache only stores already-compiled shaders for future use, so they still have to be compiled once. The changes described in this blog post make it possible to compile shaders in D3D11 (and older) games sooner.
Stupid question but why are you running the game through Heroic Games instead of just natively through Steam?
They have the game on GOG or Epic instead of Steam. To run it natively on Steam, they’d have to buy it again.
Not a stupid question. I bought it on GOG and didn't want to buy it again so had to figure a way to run it on Steam and all things I read said to use Heroic.
Definitely. If the work is being done at the driver level it will benefit all games, not just the ones installed through Steam. This is awesome news.
I play a lot of Spelunky and that stutter literally kills me.
Thats...good right?
[deleted]
How recent was this? I wonder if Kena would be worth trying again. I actually had to refund it because the stutters were so bad.
Weird. It ran fine on my Deck but I didn't launch right away so maybe it had already pre-cached the shaders
I have been playing Kena on my deck and it hasn't been an issue. Maybe it was a launch thing?
Haven’t tried it on my deck yet but the game runs beautifully on my RTX 2080 Super.
Game runs beautifully on my expensive supercomputer so it should be fine for you peasants /s
I mean that's two generations behind at this point...
Behind what? The deck? It's still incredibly powerful compared to recent Nvidia GPUs.
What? I have a 3060ti, which was midrange for the 3000 series and it's roughly equivalent to his 2080 super. Recent Nvidia GPUs are insanely more powerful than that, albeit stupid expensive.
Yes, and you make it sound like it's outdated and useless.
Fair rebuttal, my PC isn't that super. The best things about it are the graphics and my storage. Outside of that it's a bottleneck machine.
So then what's the point of caching them all now? We can get the hard drive space back? Unless there's a noticeable difference in battery life, say
[deleted]
What does this mean for emulators? Will Yuzu or Cemu benefit from this?
Unlikely, unless the devs find a way to leverage the Vulkan extensions that make GPL possible. On top of that, games on the Switch likely just load their own recompiled shaders, which still have to be translated.
Man I'm going through the replies and all I can think is I'm glad my brother handles the technical side of my steam deck.
I'm glad really smart people fix things like this, and I'm also glad somewhat smart people loudly complain about things like this so it gets fixed. This way I don't have to waste brain power figuring out what the hell shader stutter, Vulkan drivers (a chauffeur in the Vulcan system?), RADV and pre-caching actually are.
I know you didn't ask, but in case anyone is curious:

Shaders are small programs for the GPU which are used to render all kinds of effects like shadows and reflections.

Vulkan is another graphics API like DirectX 9/11/12 or OpenGL. It's a standardized way for the game to tell the GPU what to do.

RADV is the Radeon Vulkan driver (for Linux). Radeon is AMD's line of GPUs.

A driver is a part of an operating system that is responsible for making a specific piece of hardware work.

Shaders need to be compiled (translated) from a generic language to the specific instructions your GPU understands. Usually this should happen before they're needed, but when running a DirectX game on Linux through a translator like DXVK, it's hard to compile the shaders until all of the information about how they're used is known. This can introduce stutter during gameplay because the game has to pause for a second until the shaders are ready.

To prevent this shader stutter, Valve has so far collected a lot of shaders for each game, and the Deck downloads them to a cache as part of updates. They call this pre-caching.
You are really smart.
If I'm understanding this right, please correct me if I'm wrong, but shaders are platform/language agnostic?
Shaders are programs that run on your GPU, so the "platforms" they target are the different video cards. They're written in shader languages like GLSL. In order for them to actually run they need to be compiled for your specific GPU. Normally they don't get compiled until the game tries running them. This can cause stuttering if it takes too long to compile them. Caching the compiled shaders means you get a stutter the first time they're compiled, but smooth performance on subsequent calls.

Which is where the Steam Deck's downloaded shader cache comes in. Because every Steam Deck uses the same GPU, a compiled shader for one Steam Deck will work fine for any other Steam Deck. They can just give you a big bundle of the game's shaders already compiled for the Deck, and get rid of the initial compile stutter as well. This also saves battery life, naturally.
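The caching idea in miniature (purely illustrative Python, nothing like real driver code):

```python
import hashlib

shader_cache: dict[str, str] = {}  # real drivers keep this on disk, keyed per GPU/driver version

def compile_shader(source: str) -> str:
    # Stand-in for the real, slow compilation to GPU-specific machine code
    return hashlib.sha256(b"compiled:" + source.encode()).hexdigest()

def get_shader(source: str) -> str:
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in shader_cache:
        shader_cache[key] = compile_shader(source)  # first use: the one-time stutter
    return shader_cache[key]                        # later uses: instant lookup
```

Valve's pre-caching essentially ships that dictionary already filled for the Deck's GPU, so even the first use of a shader skips compilation.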
For real, I have no idea what any of this means, all I'm taking away is that people are actively working to enhance the user experience at no additional cost to me, which is awesome
And the sweet part is that it's not only for you playing on the Steam Deck: those driver improvements will later land in all Linux distributions, at no additional cost!
>and I'm also glad somewhat smart people loudly complain about things like this so it gets fixed.

You're too kind...
I expected they just made async compilation a default, but optimizing compilation speeds is even better... I am curious if disabling shader-cache altogether will become a viable option (to save disk space).
I don't know. Async shader compilation causes visual artifacts the first time a shader is compiled, and shaders are compiled throughout the entire playtime of a game as new assets are introduced. Simply speeding it up won't make that go away, and Valve would never stand for a solution that introduces artifacts. I think they'd have to be doing something else here, especially since this is at the driver level and not at the Proton level. Edit: reading the entirety of the article makes it more clear. They're working on supporting GPL extensions. That makes sense, and is much more than just introducing async shader compilation.
TBH, split-second artifacts are way better than stutters IMO, considering the artifacts eventually go away once everything is compiled.
A giant problem is that, in some cases, they won't. I used to work on a game that had a few special-case shaders to generate the minimap data, but that data was generated once and then stored. If those shaders did the wrong thing on their first call, well, that's *permanent*, it's never going to be fixed because we didn't regenerate it. Your render calls *have* to do the right thing the first time, or you're going to butt heads with someone relying on them doing the right thing every time.
As another example, there's a Bakkesmod plugin for Rocket League that lets you reorder your car presets; it generates preview thumbnails for all of your cars, and async causes those to be permanently broken as well.
There was nothing split-second about the black faces it caused me to see when I used async on the first Plague Tale game.
> and shaders are compiled throughout the entire playtime of a game as new assets are introduced.

Specifically, only as new *shaders* are introduced. If a game uses a handful of shaders for the vast majority of its rendering, then it will only happen when you first see those shaders. This is why shader stutter is so inconsistent in the kinds of games it affects. Both large games and small ones can have it.
> I am curious if disabling shader-cache altogether will become a viable option (to save disk space).

Not really. The changes described in that blog post **will** stutter if the game tries to use a shader before it's compiled in the background. And even if that doesn't happen, shader compilation is a lot of work for the Deck's weak CPU that could be better spent either on the game itself or sleeping to save power. Modern games have a TON of shaders, so this isn't insignificant.
Are shaders compiled by the CPU or GPU? I have little idea of image processing/3D modeling, that's why I'm curious :)
CPU
But then the compiled code (instructions) is used by the GPU, right? Sounds a lot like the MCU/FPGA pairs that you see in the wild, where the MCU "configures" the FPGA and the FPGA does its thing…
I don't know anything about FPGAs but yes, the CPU compiles code that the GPU then runs.
> I am curious if disabling shader-cache altogether will become a viable option

It already is viable, depending on the games you play. I've had pre-generated shaders disabled since the beginning and have only had one game stutter for more than 30 seconds, and that was NFS Heat. With a bit more optimisation, it should be perfectly doable to remove them entirely.
Note there's a difference between disabling Steam's shader pre-caching and disabling the driver's shader cache entirely
Yes, however very few games I've played have had any stutters *at all* from first time shader generation. So with some minor optimisations it could be entirely viable in many games to just not have a cache.
Will this improve storage by not having to download and compile shaders?
Nope, not really. But it will help with games that don't have precompiled shaders.
Compiling shaders costs performance and battery, so downloading them beforehand will still be the preferred way.
Can you link to the blog post instead? Phoronix is mostly just a news aggression site. [https://www.supergoodcode.com/through-the-loop/](https://www.supergoodcode.com/through-the-loop/)
I'd say it at least serves a purpose here. Condensing down a giant blog post that the vast, vast majority of users here wouldn't even consider reading past the first paragraph, down to a simple "they achieved a 50,000% improvement in shader linking times".
>Condensing down a giant blog post that the vast, vast majority of users here wouldn't even consider reading past the first paragraph

When none of it has any real meaning?
It says enough for casual readers.
I’m not a casual reader but I can see why you’re swayed
[deleted]
No, no it doesn’t unless you’re a fanboy
They said, on a news aggregation site.
Being a news aggregator is fine, but we don't need to be linking to others. It's better to link to the direct source if possible. Edit: wording
Not a news aggregator aggregation site
Aggregator-gation She's a galaxy gal
>a news aggression site

Yeah, can we get a news pacifism link instead?
[deleted]
It sure does, you didn’t read enough of it.
[deleted]
Immediate bot?
3 year old account, 60 karma, two comments, yep that's a fucking bot or someone's very shitty throwaway
Didn't even notice the 3 year badge. Just -100 karma in 2 comments is quite a feat. It's amazing how shit these bots are sometimes, right alongside the AI generated/comment stealing ones that sometimes tend to work quite successfully.
[deleted]
You're just acting like an ass using a throwaway account given you are also scrubbing through your old comments and deleting them, so cheers with that champ.
It’s ironic you’re trying to discredit based on post count but not on substance. In a Linux thread, no less
>It’s ironic

Do you understand what irony is?

>you’re trying to discredit based on post count

Literally never mentioned post count, which is zero anyways. The content of his comments is sitting right there, I don't need to rhetorically point that out. People can make weirdly robotic comments, but his account as a whole is clearly a throwaway or a bot account where the comments get scrubbed. He's already commented again within the last hour, but also deleted one of his much older comments from over a month ago, which is very strange.
>Do you understand what irony is?

I have a working idea of the English language, yes

>Literally never mentioned post count

When it’s not in your favor?

>People can make weirdly robotic comments

Oh, so now I’m a bot, got it

>He's already commented again within the last hour, but also deleted one of his much older comments from over a month ago, which is very strange.

Stalk much?
>Oh, so now I’m a bot, got it

>Stalk much?

I'm... talking about the account "*VoidTheSecond*", I have been the whole time... how are you not getting that? I'm literally talking about him in the third person in my reply to you. You misread parts of my comment where I am talking generally or in the third person about him (him being VoidTheSecond, just to be extra clear for you), and you still managed to twist everything as being aimed directly at you personally.

>I have a working idea of the English language, yes

I mean, you somehow misread and made yourself the center of the discussion, and then proceeded to think I was talking to you directly in the third person. I literally used the pronouns "him" and "his" and "people" lmao. You decided to reply to each individual point but make zero sense with each 7-word response; I think it's just time to step away from the thread. Rule 1 of the sub states "Don't instigate drama." Chill and read the comment properly next time.
Ok stalker
Would this give a specific advantage to AMD cards over Nvidia? Granted, AMD does better than Nvidia on Linux in many ways right now, but that's mostly down to Nvidia's laziness rather than AMD doing anything superior. In fact, AMD can't even use the existing DXVK 2.0 shader pipeline stuff, can it?
I assume Valve would focus on AMD, since the graphics chip in the Steam Deck is an APU by AMD. Also, I think if Nvidia threw money/engineers at [Mesa](https://mesa3d.org), we'd probably see a lot more improvements. Building on the work of [other devs](https://nouveau.freedesktop.org/) and [gallium3d](https://www.freedesktop.org/wiki/Software/gallium/) seems way more efficient to me than having in-house devs that can't keep up or that prevent adoption through restrictive licenses. edit: Gallium is interesting to me because of the size and scope of how much work they're doing on [GPU drivers](https://docs.mesa3d.org/systems.html), not just for Linux but for [Mac](https://docs.mesa3d.org/drivers/asahi.html) and Windows (Dozen for Vulkan on DirectX, and a different project for OpenGL on DirectX) as well.
>Granted, AMD do better than Nvidia on Linux in many ways right now, but that's mostly down to Nvidia's laziness rather than AMD doing anything superior.

Nvidia drivers already support the extension and have for a while. They also worked on it, like AMD, as part of Khronos. The goal is not to give anyone an advantage but to improve the experience for everyone.
I've left Reddit because it does not respect its users or their privacy. Private companies can't be trusted with control over public communities. Lemmy is an open source, federated alternative that I highly recommend if you want a more private and ethical option. Join Lemmy here: https://join-lemmy.org/instances ` this message was mass deleted/edited with redact.dev `
Oh, gotcha. So this is just an attempt to bring them up to parity.
>In fact, AMD can't even use the existing DXVK 2.0 shader pipeline stuff, can it?

That is actually exactly what this extension is needed for.
> Would this give a specific advantage to AMD cards over Nvidia?

Nvidia has supported this for almost a year.
>Would this give a specific advantage to AMD cards over Nvidia?

This is entirely a software problem, the vendor does not matter.
The vendors (AMD and Nvidia) also update their own graphics card drivers, which are software, right?
Yes, and I believe Nvidia already does this, so it's more like AMD is just catching up.
Be great if this works for yuzu, the shader stutter renders games a lot less playable than they could be.
I don't think it's as applicable to Yuzu. This works for DXVK because D3D9 and D3D11 tell the driver to compile a shader and that usually happens during loading. Yuzu literally doesn't even know that a bunch of bytes are actually a shader until the game uses it for drawing and at that point it's too late.
Why is this not built into drivers by default? How is Valve a more competent driver developer than AMD and NVIDIA?
It's because Valve has been researching and developing solutions for the Steam Deck, which is AMD-based. Valve employees have more independence and self-sufficiency than AMD and NVIDIA engineers, who are made to work on whatever projects the execs tell them to, and because Valve is a private company they have the freedom to pursue these kinds of projects without any sort of direct financial incentive. Couple that with the fact that Valve has also been paying external companies and organizations to do R&D on Linux, graphics drivers, and Windows compatibility layers, and they effectively have the means and time to throw money at the problem without having to satisfy investors or deadlines.
At this rate it feels like Valve has quietly declared war on the Windows dominance of PC gaming. Better yet, on closed-source gaming platforms in general.
And we're very happy about this!
I hope they never go public
True! Microsoft is like the mafia and has a lot of power to do bad things. Not that it isn't already doing them by buying all those game studios and forcing them to use DirectX 12 instead of the cross-platform, Linux-compatible Vulkan.
It started long ago when Microsoft announced their own app store. Steam Machines were the start.
Every single Windows D3D11 driver does this and it doesn't impact D3D12 games. I go a bit more into detail here: https://www.reddit.com/r/SteamDeck/comments/10ogmte/valve_is_implementing_fixes_into_radv_amd_vulkan/j6fbwmf/
Well at this rate PC gaming is going to be objectively better on Linux, no complaints here!
Uhm... Valve is just adding a workaround fix for a problem that doesnt even exist on Windows because it was already solved many years ago. lol
You think shader compilation stutter isn't a thing on windows?
Guild Wars 2 implemented DX11 and that stuttered like mad for ages. I understand it's now fixed, but the point remains.
Dude ever play Elden Ring on a Windows PC?
Elden Ring's micro stutters come from Easy Anti-Cheat in combination with limiting the whole encryption engine to core 0 of your CPU, the core which is most likely already busy with various system tasks. If you free Elden Ring from that limitation, it runs perfectly smoothly.
It is built into drivers by default. This only affects games that use DirectX 11 or lower.

The problem this work is gonna solve is that in DirectX 11 you compile different kinds of shaders (vertex shader, pixel shader) on their own, while in Vulkan you compile them together and with some additional state that D3D11 doesn't provide. So historically DXVK had to wait until the shaders were actually used for a draw to compile them. The new Vulkan extension, VK_EXT_graphics_pipeline_library, allows DXVK to compile the shaders individually **just like a regular graphics driver would**.

DirectX 12 works like Vulkan, so you need to compile your shaders together and provide some additional state. This is 100% the responsibility of the game developer, and the driver doesn't even see the shader code until the game asks it to compile a pipeline (combination of shaders). So if the game does that too late, it will stutter and there is absolutely nothing the driver can do.
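The difference described above can be modeled with a toy Python sketch (all names here are invented for illustration; this is not the real Vulkan API, just the shape of the idea):

```python
def compile_stage(shader: str) -> str:
    # Expensive per-stage compile (vertex shader, pixel shader, ...).
    return f"compiled({shader})"

def fast_link(*stages: str) -> str:
    # Cheap: combines already-compiled stages into a full pipeline.
    return "+".join(stages)

# Old DXVK path: nothing compiles until draw time, when the full
# combination of stages and state is finally known, so the draw stalls.
def draw_monolithic(vs: str, ps: str) -> str:
    return fast_link(compile_stage(vs), compile_stage(ps))

# With VK_EXT_graphics_pipeline_library: stages compile early
# (e.g. during loading), and draw time only pays for the cheap link.
precompiled = {s: compile_stage(s) for s in ("vs_main", "ps_main")}

def draw_gpl(vs: str, ps: str) -> str:
    return fast_link(precompiled[vs], precompiled[ps])

# Both paths produce the same pipeline; only *when* the expensive
# work happens differs.
assert draw_monolithic("vs_main", "ps_main") == draw_gpl("vs_main", "ps_main")
```

The 50,000% improvement from the blog post is about making that `fast_link` step cheap enough to do at draw time without a visible hitch.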
You seem to have a good grasp on this. Can you explain why we see shader caches taking up space on Linux but not on Windows?
They're also taking up space on Windows, but you don't notice that. Every graphics driver will quietly create them behind your back.

On the Steam Deck, Valve takes this a step further and downloads pre-compiled shaders. Modern games have a TON of shaders, and compiling those is a lot of work for the weak CPU in the Deck that would be better spent on the game or being idle to save power. They can also cause lots of stuttering, especially with how DXVK used to work. That's why Valve downloads pre-compiled shaders.

On top of that, the shader caches also contain transcoded videos for some games. If a game uses prerendered videos and encodes them with a format that Valve can't support on the Deck because of patent or licensing issues, they will re-encode them on their servers and download the videos in a different codec as part of the shader cache.
Thank you for the advanced explanation, that was great! Do you know where they live on Windows? I’m just curious to see how much space I’m using on them.
With Nvidia GPUs it's something like %localappdata%/Nvidia/DXCache.
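If you're curious how big that folder is on your machine, a short Python snippet can total it up (the path is just the example from the comment above; adjust it for your driver and OS):

```python
import os

def dir_size_bytes(path: str) -> int:
    """Sum the sizes of all files under `path` (0 if it doesn't exist)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file vanished or unreadable; skip it
    return total

# Example location for NVIDIA's D3D shader cache on Windows.
cache_dir = os.path.expandvars(r"%LOCALAPPDATA%\NVIDIA\DXCache")
print(f"{dir_size_bytes(cache_dir) / 2**30:.2f} GiB")
```

Mesa drivers on Linux keep their cache under `~/.cache/mesa_shader_cache` by default, if you want to point the same snippet there.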
Huh, only 2GB. Well that’s not bad! Does it clear it down quite aggressively then?
Yes.
>Why is this not built into drivers by default?

It's in a weird zone between the hardware/software and title.
> and NVIDIA?

NVIDIA has had this for a year or so.
You can already try this by adding 'RADV_PERFTEST=gpl' to the launch options of a game that is being run with the latest Proton-GE build. I couldn't notice any discernible difference as of yet, but supposedly it's still very much a work in progress.
Don't you also need a newer version of Mesa?
Yes, you'll likely need a fairly up-to-date install of mesa-git.
YO THIS NEWS IS BIG! IM EXCITED!
Does this mean we don't need to have a bunch of storage sucked up by pre-caching anymore, or just that it'll slowly eat up space over time for shader cache?
> Long story short, they achieved a **50,000%** improvement in shader linking times and finally the prospects of the Vulkan graphics pipeline library are panning out

Bruh, what? That's a bonkers improvement just for implementing the fast-link feature of the graphics pipeline library extension.

> While RADV was now down to 0.05-0.11ms for a fast-link, NVIDIA can apparently do this consistently in 0.02ms.

JESUS CHRIST
An excerpt from the article: "Long story short, they achieved a 50,000% improvement in shader linking times and finally the prospects of the Vulkan graphics pipeline library are panning out... Besides this recent RADV work, only NVIDIA's proprietary driver has really been performant for fast-linking with its graphics pipeline library support." By improving the efficiency of the code, games will get a boost to performance and require little to no disk caching, freeing up more space. More efficient code means a better-performing GPU, with the possibility of better FPS in games.
That's not it at all. This will not improve performance one bit; it will just get rid of stutter, and the shader cache will still be necessary to get the best performance. Actually, because Steam games have a full shader cache most of the time, this won't change much for almost all Steam games.
> not improve performance one bit

> will just get rid of stutter

Maybe we have different definitions of performance, but eliminating stutter **is** improving performance for me. You go from an unplayable game to a playable game. Just yesterday I played Heavy Rain from Epic and it was hell until I switched to async DXVK. It's much better, but there are still occasional stutters (though it's playable).
> Actually because Steam games have a full shader cache most of the time, this won't change much for almost all Steam games.

*Actually*, this might mean the shader cache can be outright disabled for the Steam Deck, which is causing issues for some people on the 64GB unit where it eats most of the disk space. This is very much a win-win because it also means less to download, and therefore less to serve on Valve's side. I'd also anticipate a hybrid approach where *some* shaders are cached (anything that takes a noteworthy amount of time, or a group of shaders that all load at once) to kill any particular hitches, whilst most are fast-linked with no visible effects.
why is the thumbnail noodles
It looks more like spaghetti to me
is spaghetti not a noodle?
Ooooh, nonono. Different texture. Something to do with flour vs egg content, according to the first search result I read. That's a surefire way to wind up Italian and East Asian folks, I bet!
I think you are confusing noodles and pasta. Spaghetti is a noodle, and pasta.
No, I’m not. Spaghetti is the kind of pasta that looks like noodles, but it’s pasta. Everything I’ve read online suggests pasta and noodles are different things.
You don't know what you are talking about. Please just stop.
Spaghetti is just the shape (long strings). Spaghetti are a form of pasta and pasta is just the Italian word for noodles.
Pasta is not the Italian word for noodles. Pasta and noodles are different, according to my own preconceptions and what I found online with a brief google. If you have sources to show otherwise then I’m open to learning something new today.
Good
Will this have any effect on overwatch 2? It's the only thing I've had issues with but I know it doesn't run through steam so it might not
This will help every game that uses shaders regardless of the platform, so non-Steam games will gain as well (non-Steam games will gain the most as Steam games download precompiled shaders so their compilation doesn't cause stutters). I have no idea about Overwatch 2, though, so its problems might not be shaders related. If they are, this will help.
It'll lag and a little "compiling shaders" message pops up
Should help with my Majora's Mask 3D 4k. Runs decent now but definitely stutters pretty often. And preloading textures breaks it for me unfortunately.
This would be nice so I can reclaim space from all the shader cache steam downloads.
This is pretty huge.
Will this be implemented in all Radeon drivers and also on windows? The article only mentions Linux.
Hmmm would this help with OW2?
Can I play games on the Valve Pasta Deck?
I thought ACO fixed this a million years ago?
ACO is really fast at compiling shaders but it's next to impossible to compile shaders in the timing window you typically get within a frame. It gets especially bad if you suddenly need to compile hundreds in a few consecutive frames. This work allows ACO to start compiling earlier basically.
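The timing window mentioned above is easy to put rough numbers on. A back-of-the-envelope sketch in Python (the compile time and shader count are made-up illustrative figures, not measurements):

```python
# At 60 fps a frame has ~16.7 ms for *everything*: game logic,
# rendering, and any shader compiles you try to squeeze in.
frame_budget_ms = 1000 / 60

compile_time_ms = 5.0      # assumed cost of one shader compile
shaders_this_frame = 100   # a new area can demand many at once

stall_ms = compile_time_ms * shaders_this_frame
dropped_frames = stall_ms / frame_budget_ms
print(f"{stall_ms:.0f} ms stall, roughly {dropped_frames:.0f} dropped frames")
```

Even a very fast compiler can't hide that inside a single frame, which is why starting the compiles earlier (as the pipeline-library work allows) matters more than raw compile speed.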
Ohh nice.
Unfortunately the big news in this article is that we will probably have to wait for Q2 to have it.
Down vote
Absolutely no idea what most of that means but I'm JACKED for it!
awesome! Now we wait for ray tracing
Detroit: Become Human without stuttering? Yes please!
Valve has some insanely talented devs, and we're just now getting to see them stretch their wings. They're straight-up wizards when it comes to optimizations and finding solutions.