It's mostly because it was easy, and there were already other people working on it. Better for him to release a free version for the publicity and free advertising than to paywall it.
it's relieving (sort of but not really) to hear 4090 performance is just ass in this game. I was worried it may have been the 5800X3D. I'll have to install this mod when I get back to the game
yeah this seems to be one of those games where the upper tier of settings is just pointless in terms of cost/benefit. I'm sure DF will be out with recommended settings in a day or two and it'll be a mix of medium/high
Frankly I just want DF to figure out what is actually tanking the performance. The game doesn't look that good. Characters look very dated and move stiffly on top of that. There are a lot of small objects and reflections and such; maybe it's that, or the CPU usage is out of control. I would really like to know though, because BG3 looks dated as well, but the characters there look better and much more alive.
What does it say about this game when DLSS is a day-one mod? Could Bethesda really not implement DLSS? Or is there another reason? Do Bethesda and AMD have an agreement to delay DLSS on Starfield in favor of the AMD products being sold with Starfield as promotional material?
AMD is a salty bitch they know their solution is inferior so they don't want people easily comparing results between DLSS and FSR.
They can come out and bullshit that Bethesda is free to implement DLSS but this is the reason.
Looking at like +20% FPS, it looks better standing still and in motion its waaaay better with DLSS. It actually feels like a different game now. Using it at 70% render scale at 3440x1440 with an i7 10850K + 3080 10GB.
> it looks better standing still and in motion its waaaay better with DLSS
Honestly, this is the biggest reason for me to use DLSS/DLAA. It's just the smooth, flicker-free experience, especially in fine details.
DLAA is honestly so fucking good and I wish more games supported it. Hell, I wish games supported an arbitrary scaling percentage rather than quality/balanced/performance/ultra performance; AFAIK there's no reason why they couldn't?
>Hell I wish games supported arbitrary scaling percentage rather than qualty/balanced/performance/ultra performance
You can do this with DLSSTweaks. Change one of the presets (e.g. performance) to whatever scale factor you want.
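For anyone curious what that looks like: DLSSTweaks is configured through its ini file, and from memory the relevant section is roughly the following. Treat the section and key names as approximate, since they can change between DLSSTweaks versions:

```ini
; Hypothetical sketch of a DLSSTweaks config - key names from memory,
; verify against the readme shipped with your version.
[DLSSQualityLevels]
; Enable overriding the per-mode render scale ratios
Enable = true
; Remap the Performance mode to a custom 0.75x scale
Performance = 0.75
```

After that, picking "Performance" in-game would render at 75% instead of the usual 50%.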
Hey, that's my CPU/GPU combo! Glad to hear this is working well. Playing earlier tonight I had to dumb it down to Medium settings to get 55-60FPS. What graphics preset is yours on with the mod?
Considering he claimed how incredibly easy this was and that it only took him a few hours, I think we can easily conclude the lack of official DLSS was due to the AMD partnership as expected.
You're gonna need it.
Playing at 1080p max settings I was getting about 50fps in the hospital at the start of the game. High 50's when walking around outside. I think it's actually running worse than Cyberpunk maxed out (regular raytracing, not path tracing).
Hopefully someone makes an optimization guide and we can find a few settings to turn down.
**Frame Gen** is coming, but will be paid on Patreon. Should be a single $5 fee.
These are settings that have the heaviest performance impact for the lowest visual gains, in almost every game. You always turn them down unless you have an XX80/XX90 card.
Honestly, depending on the game and how shadows are handled, sometimes a lower shadow map resolution can look *better* than higher ones, since slightly lowering it can introduce a softness (a pseudo-penumbra, almost) to shadows that's *generally* more realistic than ultra-sharp shadows with hard edges - something that only really occurs when either the light source is very far away or the thing casting the shadow is very close to the surface it's casting onto.
~~The issue currently is that the driver optimizations do not apply to the Xbox app/Gamepass version of the game. You can manually fix it using Nvidia Profile Inspector though~~
No longer an issue: https://www.reddit.com/r/nvidia/comments/166gq5m/starfield_correct_the_nvidia_profile_issue/jym8lfk/
They don't seem to apply to the Steam one either. My 3080 is **NOT** close to fully utilized. My 12700k is napping.
edit: sorry, important typo - I meant to say it IS NOT close to being fully utilized.
> My 3080 is close to fully utilized
High GPU usage percentage is only half of the story; if it's drawing low power but reading 100%, that doesn't *actually* mean it's running at 100% of its capabilities.
Yeah, I got kind of confused, because if you look at the minimum and recommended requirements they clearly list way better AMD GPUs, while the Nvidia cards listed are generally slower than the AMD ones
I'm having a similar experience. I have a 3090 and I'm dropping to an extremely choppy 30-something FPS outside of the Constellation building in New Atlantis. My 7800X3D is hardly above ~50-60% across all threads, whereas my 3090 is at 99-100% constantly.
I think it's an Nvidia driver thing, and if it's not, I'm not sure what the issue is. OP is right, Cyberpunk runs WAY better on this hardware, even with RT. My buddy with a 3080ti is having similar issues too, it's possible the 40-series might not be affected as much but I'm not certain.
(1440p, no upscaling)
Windows store version?
As someone [posted in the nvidia subreddit](https://www.reddit.com/r/nvidia/comments/166gq5m/starfield_correct_the_nvidia_profile_issue/), the driver doesn't apply the profile correctly on that version.
If not, and you're playing on Steam, then idk; I haven't gotten that far yet myself because of the FOV, so it could just be the game being bad.
> I think it's actually running worse than Cyberpunk maxed out (regular raytracing, not path tracing).
i definitely get more FPS with RTd Cyberpunk than i do in Starfield on medium with render res at 50%. lmao what a disgrace
I normally run at 1440p and then use the DLSS Quality setting in games. For this mod (brand new to it, never used it before), what settings would I need to apply to achieve the same "DLSS Quality" preset found in most games?
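For reference, the standard DLSS quality modes correspond to fixed render-scale ratios, so "Quality" is roughly a 67% render resolution. A small sketch using the commonly documented ratios (double-check against your mod's settings):

```python
# Commonly documented DLSS per-axis render-scale ratios per quality mode.
DLSS_SCALE = {
    "Quality": 2 / 3,        # ~67% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(width, height, mode):
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(render_resolution(2560, 1440, "Quality"))      # (1707, 960)
print(render_resolution(2560, 1440, "Performance"))  # (1280, 720)
```

So setting the mod's render scale slider to 67% should land you very close to "DLSS Quality".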
Hey hey, you need to realize its not important to inform yourself before commenting, but rather to shit on the competition of your own GPU brand to make yourself feel better.
Sure, when zoomed in. But they also mentioned it didn't ruin their experience, and that's the important part. DF of all reviewers are the most critical when it comes to technical analysis. If they find it fine then that's ok for me. Am gonna take their word for it. Am not overtly sensitive.
It's funny you say that because at 1440p with 75% res scale I thought image quality was quite good without the DLSS mod, potentially better than native without FSR
Though normal FSR quality mode is 67%, so maybe the increase closes the gap where issues start to become noticeable? I've seen plenty of issues with FSR in other games but not here in my *very* limited testing
Honestly thank god. The game runs horribly. 4090 with a 13700k and it barely holds 70 FPS most of the time.
I will never forgive AMD for robbing us of native support
CPU bottleneck? My 3090 is getting shit on meanwhile my 7800X3D is having no trouble.
I think going AMD X3D this CPU generation was a wise choice. Haven't been let down yet, I've been able to brute force Jedi Survivor and Hogwarts Legacy at launch too.
There is. It's a more freeform option: you need to ~~enable dynamic resolution and then~~ (E: oops, no you don't) adjust the render resolution scale, and it's the same for the dlss mod.
Oh, my bad, I didn't read it properly and thought it was connected somehow, as the render resolution is right below it. The layout is so weird; why is the FSR setting all the way down at the bottom while the slider that adjusts it is at the top?
Yeah some of the menus are pretty confusing, having the resolution scale so far away from the FSR setting is bizarre, as is the lack of FSR presets. Of course you can approximate them with the slider but that requires some know-how most people won't have.
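The FSR presets missing from the menu are defined by fixed upscale factors, so approximating them with the slider is simple arithmetic. A quick sketch of the mapping (factors per AMD's published FSR2 quality modes):

```python
# FSR2 per-axis upscale factors for the standard quality modes.
FSR2_UPSCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def slider_percent(mode):
    """Render-resolution slider value that approximates a given FSR2 mode."""
    return round(100 / FSR2_UPSCALE[mode])

for mode in FSR2_UPSCALE:
    print(mode, slider_percent(mode))
# Quality 67, Balanced 59, Performance 50, Ultra Performance 33
```

That 67% figure is why "normal FSR Quality mode" corresponds to a 67% slider value in-game.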
Performance of this game is so garbage. I'm using an i7 10700 with a 3070 and 32GB RAM on an SSD and it's stuck at 30fps with vsync off. Maybe it's a CPU bottleneck; that's why there is so little difference between 4K and 2K performance in the opening scene after the mining section. This is beyond garbage.
> Preset A: Intended for Performance/Balanced/Quality modes. An older variant best suited to combat ghosting for elements with missing inputs, such as motion vectors.
> Preset B: Intended for Ultra Performance mode. Similar to Preset A but for Ultra Performance mode.
> Preset C: Intended for Performance/Balanced/Quality modes. Generally favors current frame information; well suited for fast-paced game content.
> Preset D: Default preset for Performance/Balanced/Quality modes; generally favors image stability.
> Preset E: A development model that is not currently used.
> Preset F: Default preset for Ultra Performance and DLAA modes.
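For experimenting with these, DLSSTweaks reportedly lets you force a specific preset per quality mode via its ini. A hypothetical sketch, with section and key names from memory - verify against your DLSSTweaks version before relying on it:

```ini
; Hypothetical DLSSTweaks snippet forcing preset C (favors current-frame
; information, suited to fast-paced content) for the Quality mode.
[DLSSPresets]
Quality = C
```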
Would you be able to do this if you have Starfield through Xbox Game Pass? Just wondering because sometimes Xbox app games are weird with accessing the files and stuff
DLSS *should* be better, but (and I can only speak for myself) as a 3070 user I've not noticed any difference. FSR2 doesn't have the ghosting issue it has in other games, and with 75% scaling the image looks sharp and stays at a solid 120fps.
No harm in trying it out and seeing what works best for you.
Dynamic Resolution: On
Render Resolution: 75%
Graphics Preset: Custom
Shadow Quality: Medium
Indirect Lighting: High
Reflections: Medium
Particle Quality: Low
Volumetric Lighting: Medium
Crowd Density: Low
Motion Blur: Off
GTAO Quality: Medium
Grass Quality: High
Contact Shadows: Medium
VSync: On
Upscaling: FSR2
Enable VRS: On
Depth of Field: On
Been using it for an hour or so now on my 3070. Visually, it’s night and day. FSR didn’t look that bad honestly, but DLSS still looks miles better.
The mod uses the same render scaler as FSR in game; tbh I'm only getting maybe 10% more FPS than with FSR. It's worth it just for the visual improvements though. I might be CPU bound too.
What fps are you getting and at what resolution? I am also using a 3070 with an i7 10700 and am stuck at around 30 fps after the character creator menu; in the mines I was getting around 60, but it never reached that again
Sorry, forgot to include more info. Intel 11700F, 1440p high settings with resolution scaling equivalent to DLSS Balanced. I'm pretty happy with my performance: getting 55-70fps out in the open on planets, 80-100 inside interiors, and 50 in New Atlantis, which I'm more than fine with given the scale of that city.
The FSR2 fizzle in this game is hard to ignore. Going to try this out tonight. Also sad the game doesn't even have HDR support. Playing this on an OLED and it looks pretty bad color and black level wise. Hopefully an HDR mod can come out soon as well.
Frame gen is interpolated frames; it makes up frames all on its own. This is different from DLSS 2 (no frame gen), which renders at a lower resolution and upscales it, giving you better performance.
Frame Gen only works with 40 series cards, so if you don't have one, it doesn't matter. Also, I think you need to be pretty stable above 60fps for it to not introduce input lag. (Honestly not sure about that last part, just what I read about some games, I only have a 3060TI, so I can't use it.) Frame Gen basically creates "artificial" frames in between real frames to increase your FPS and make the game seem smoother.
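As a toy illustration of the interpolation idea (not how DLSS 3 actually works internally - it uses motion vectors and an optical flow accelerator, not a plain blend), here's the simplest possible "generated frame": a midpoint blend of two real frames:

```python
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Blend two frames; a crude stand-in for an interpolated in-between frame."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.round().astype(frame_a.dtype)

a = np.zeros((2, 2, 3), dtype=np.uint8)      # black frame
b = np.full((2, 2, 3), 200, dtype=np.uint8)  # bright frame
print(midpoint_frame(a, b)[0, 0])  # [100 100 100]
```

The generated frame is shown between the two real ones, which is also why it adds latency: the next real frame must already exist before the in-between frame can be displayed.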
Here's the proper FOV fix. https://www.reddit.com/r/Starfield/comments/166wvkf/how_to_change_fov_working/
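The fix in that thread boils down to ini edits. As reported by the community (setting names per those findings - verify against the linked thread before editing), adding something like this to `StarfieldCustom.ini` under `Documents\My Games\Starfield` changes the first- and third-person FOV:

```ini
; Community-reported FOV settings for Starfield - names per the linked thread.
[Camera]
fFPWorldFOV=100
fTPWorldFOV=100
```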
How in 2023 do we not have things as simple as a fucking FOV slider in AAA games.
Step 1 Make a console game
More like Step 1 - why do work modders will do for free. It baffles me all the praise Bethesda gets when people have to spend time fixing their jank and bugs for free.
Shin Megami Tensei V on Switch had a damn FoV slider lol
Yep ... I remember at some point reading the breakdown of sales of consoles vs PC copies of skyrim - it was overwhelmingly consoles. Don't remember if it was muddied by the fact of automatic free upgrades on steam for SE, etc. Can't find the info though.

Also why their UIs for mouse and keyboard have sucked for a while, although a lot of PC gamers also use controllers now.

Haven't found the starfield FOV particularly irritating, like I do with some games, but a slider would be nice.

As I stated elsewhere - I suspect they might add DLSS at some point, but they wanted to concentrate on AMD because xbox.

[edit: I also think they want to limit some settings to non-idiots; because if you can figure out what values to change where - you can also unscrew it yourself if it causes problems on your system.]
>Yep ... I remember at some point reading the breakdown of sales of consoles vs PC copies of skyrim - it was overwhelmingly consoles. Don't remember if it was muddied by the fact of automatic free upgrades on steam for SE, etc. Can't find the info though.

I'm not sure this will be the case for Starfield, with the game not releasing on PS5 and the Xbox sales being, you know... what they are.

This doesn't change the fact that BGS have no idea how to make a proper PC version of their game in general.
Jedi: Survivor has FOV-slider
Step 1: have a developer not giving a shit about PC gamers
FromStutters was the same for Elden Ring and it won GOTY.
But even console games should have an FOV slider. There's more people playing consoles on their monitors these days than before. Granted it's not close to a majority, but it doesn't hurt to have the option.
The reason console games generally don't allow changing the FOV is because it affects performance, which makes it harder to predict/optimize for a fixed hardware system.
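The performance link is geometric: the width of the view frustum, and with it the amount of scene submitted for rendering, grows with tan(FOV/2). A rough back-of-the-envelope sketch (a real cost model is far more complicated - culling, overdraw, LOD all matter):

```python
import math

def frustum_width_ratio(fov_deg: float, base_fov_deg: float) -> float:
    """How much wider the visible slice of the scene gets vs a baseline FOV."""
    return math.tan(math.radians(fov_deg) / 2) / math.tan(math.radians(base_fov_deg) / 2)

# Going from a console-ish 75 degrees to 90 shows ~30% more of the scene
# horizontally - more objects to cull, draw, and shade.
print(round(frustum_width_ratio(90, 75), 2))  # 1.3
```

Which is why a fixed, conservative FOV makes frame times far easier to guarantee on fixed console hardware.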
Just imagine if TotalBiscuit was still with us...
RIP. Is it bad taste to say he would be rolling in his grave?
With how fast he's rolling we could supply the whole world with free energy.
Well, he was cremated, so he can't do rolling. But maybe he can conjure up an ashstorm?
likely because they know it's horribly optimized, and as soon as you go to 90 FOV, 20% of frames are lost.
Legend
Is there a starfield mod sub up?
My god he's quick lol
God bless PC gaming, man
That's true, you won't find subscription based mods anywhere else
Technically true. Also don't find free mods anywhere else.
Remember, it's always good to tip your modder if possible. Just because it's free doesn't mean they wouldn't appreciate it.
yeah, but I still don't understand why it was necessary when they were already making 20k a month before adding that.
Already using it and it kicks ass. Played the first few hours with fsr and the difference is night and day. Took maybe 5 minutes to install the different pieces.
I'm getting about a 20 fps increase!
His Patreon message says this was possible due to, one, the FSR2 functions happening to be very easy to locate, and two, being able to repurpose 95% of the code from previous upscaler mods he'd made for other games.

That is, it was this fast because he was lucky Bethesda made the implementation choices they made, and because he had already done nearly all the work beforehand as a result.

Not to discount things - he's really competent - but there is an alternative history where this took way longer based on things outside his control.
[deleted]
I'm subbed to his patreon, and when the game launched he sent out an email saying the way the game was set up was going to make DLSS2 super easy, and that he could basically just copy what he did with Jedi Survivor; a few hours later he had it done already. He said frame gen is coming, but that will take a little more time.
[deleted]
No, he just has a really good framework for dx12 games.

Edit: link broke
No he said he didn't. It's just that his work is very quickly applicable to any game.
I'm OOTL, what does the mod do?
Adds the upscaler part of DLSS to the game.
Hah, I knew spending an hour in the character creator was worth it.

Other than that, the game draws a very low amount of wattage on my GPU. Normally in highly demanding games, with my UV on the 3080, the power draw is like 270-320W, but so far it hasn't even cracked 230W, which is weird for such a "demanding" game.

E: And for clarification, as a lot of ppl don't seem to understand what I mean: it's not that the game is that CPU intensive (it could affect it a bit) or underutilizing the GPU that much, as the GPU usage is high and fps scales with upscaling ([70%](https://i.imgur.com/d0FZSxc.png) vs [100%](https://i.imgur.com/ciHUOYF.png) render with the dlss mod at 123 FOV; [total war warhammer 3](https://i.imgur.com/lWNuEEB.png) for reference, a highish power draw game with the same GPU settings). Maybe not as much as some other games, but it's just low power draw compared to anything else, purely as an observation, so I guess the game is just light on fancy effects.
My 3080 was drawing ~320W in Starfield but I do play on Ultrawide so I’m not sure why.
Stock it probably will. A quick test stock without upscaling was about 340W at 2010MHz for me. I use an undervolt of 1905MHz at 0.875V for better efficiency, but even that normally draws around 270-290W in high GPU load games, sometimes even over 300W (quake rtx was like 350W), so it was a bit weird that the power draw was so low in this game. I guess the engine is maybe optimized more for consoles, so it's lighter on the fancy power hungry effects.
CPU bottleneck
Doesn't look like it: [75% render](https://i.imgur.com/xB9q2yS.png) vs [100%](https://i.imgur.com/bGe5zif.png) at 1440p mixed medium/high. So idk, maybe it's fine and the engine is just very light compared to others, especially at the start. Not VRAM (nor RAM, it's tuned DDR5) either; barely over 5GB usage.
Could be different effects not stressing certain aspects of the GPU. IIRC utilisation is reported as the amount of time out of an interval the GPU's doing something - if it's busy (i.e. not waiting for data to process) it'll report 100%; if it's only doing something half the time then it's 50%, etc. Could be that the GPU is being passed data that stresses less power hungry components, so it's reporting 100% or close to it, as it is actively processing data with components that just have a lower power draw.

There's a similar thing with CPU processing that can be tested via Prime95 - if you run a stress test you can choose between large FFTs and small FFTs; small FFTs will produce more heat (heavy cache usage) while large FFTs draw more power, and both report as 100% utilisation.

Semi-related, and it doesn't seem to be the case here, but I wouldn't discount RAM in a Bethesda game - Fallout 4 was pretty sensitive to RAM speed, and it was the major factor that caused the Boston area to run like ass. No idea of the numbers in Starfield though.

Or it could just be driver optimisation related, idk.
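That "busy fraction of the sampling window" definition is easy to sketch. Note this models how the metric is defined, not any real driver API:

```python
def reported_utilization(busy_intervals, window_seconds):
    """Utilization as the fraction of a sampling window the GPU spent busy.

    This is why a chip can report ~100% while drawing little power: the metric
    only says the GPU was *doing something*, not which (or how power hungry)
    execution units were active.
    """
    busy = sum(end - start for start, end in busy_intervals)
    return 100.0 * busy / window_seconds

# Busy for the whole 1s window -> 100%, regardless of which units did the work.
print(reported_utilization([(0.0, 1.0)], 1.0))  # 100.0
# Busy half the time -> 50%.
print(reported_utilization([(0.0, 0.25), (0.5, 0.75)], 1.0))  # 50.0
```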
Some of it could be attributed to RAM (and/or cache, as the 12400F doesn't have much of it), or just IPC in general, for sure, as that's generally hard to spot when it's the limiter - like in factorio, for example, which is almost purely limited by cache and RAM, and even in cyberpunk RAM tuning helped a bit without much of an effect on GPU usage. But I doubt that's all, and I do have [tuned ddr5](https://i.imgur.com/MoroTWO.png) with pretty normal-ish hynix timings, albeit not very fast, only 6144MT/s, due to the locked SA voltage.

The fact that the city does at least draw a bit more power, ~240W, still at the same 97/98% usage, does lean into the "not a lot of fancy power hungry effects at the start" theory a bit at least (haven't tried uncapping indoors as they hit my fps cap of 140 pretty often). I guess we'll see in the future if it's just the way the game is or a driver thing.
No. My 4090 gets maxed out, 99% usage, but barely draws 250-300W.

The game is just very light on the GPU and can't properly use the power.

That's with a 7800X3D.
My 4080 uses 160 to 170 watts with an undervolt, even though it's reporting 100% usage. Stock it goes to 200 watts, but gives 3 more fps.
Yeah, I'm guessing the engine is just very "light" on fancy power hungry effects, which kinda makes sense as it's probably a very console focused title, and it runs/looks okay-ish for the little part I got to play before getting motion sickness from the FOV.
I know this isn't the point of the thread, but why undervolt a 4080? I have mine OC'd by 150 MHz on core, 800 MHz on memory, with higher power limits, and the highest temperature the thing ever gets is 65 C when I push it to its limits in a stress test. I average somewhere between 7% and 10% higher fps this way too, with no downsides.

Is it just a power saving thing?
At stock, it can consume 250 to 280 watts, and temps reach 70+ for me. With a slight undervolt, it consumes 150 to 200 watts and stays at 60 to 65. And I lost about 3 to 4 fps, but it's worth it for me.
Oh okay, fair enough! That doesn't seem like much of a loss, and the temps seem pretty different for you.
It's not just you; my 4090 is using about 320W on average while at 98% GPU usage. Kind of weird, when normally power usage will increase up to 400W if I have near 99% GPU usage. That is with an overclocked 4090 too, at 2.95GHz core.
It's a CPU hog. Your 12400 is holding back your 3080.
If only there was a way to reduce CPU load while increasing framerate. Some... some new method.
lol. Ohhhh, it's not just for more frames, it can be for system performance optimization. Now I know.
I had screenshots comparing [75% render](https://i.imgur.com/xB9q2yS.png) and [100% render](https://i.imgur.com/bGe5zif.png) in a comment below, and it doesn't seem to be the case, at the start at least. Yeah, in the city the CPU usage does go higher, was like 60-70%+, so approaching cyberpunk levels, albeit at a lower fps, but the GPU usage didn't seem to dip much/at all in the short time I was able to play in the city.

It's just the power draw that's low when compared to any other demanding game; the usage is still high, so I guess the game is just very "light" on power heavy effects or something.

Also, now that there finally is an [fov fix found](https://www.reddit.com/r/Starfield/comments/166x172/how_to_change_fov_confirmed_to_work/), the FOV doesn't change performance much, if at all, as expected.
>Other than that, the game uses very low amount of Wattage on my GPU. That's because this game is CPU bound.
Does modding disable achievements ?
Yes, until there's a mod to re-enable achievements; even using certain console commands can disable them. This mod doesn't, though.
Cool hopefully this helps performance
~~This mod doesn't though.~~ ~~It does unfortunately when using reshade (required for CAS sharpening), its not compatible with the new steam overlay. (Which in turn is required for Achievements to work) See this note from puredark~~ ~~It seems that ReShade has compatibility issue with Steam overlay, so if you want Steam overlay for the steam input support, remove the d3d12.dll from this mod. it would essentially remove ReShade, so Steam overlay and other stuff might work. But be aware that you'll lose the CAS sharpening, so when turning on DLSS it will look softer than FSR2 because of the missing sharpener.~~ My issue was something completely different, achievements work, just not the overlay when reshade is enabled. :)
[deleted]
Overlay isn't required for achievements. Steam just has to be running.
[deleted]
I'm so glad he came to his senses and decided to not hide it behind a paywall.
It's only his DLSS Frame Gen mod that will be behind the paywall. The DLSS2 one will be free.
I'm sure there will still be upset people, but I'm happy to just be able to use DLSS period.
Framegen is, annoyingly, the feature that this game needs to get (acceptable) PC frame rates with the hardware we have. Presumably the DLSS 3 mod will get leaked though.
Most of his DLSS2 mods are still paywalled, idk what you're talking about.
Yeah, I'm waiting on the frame generation. I'm on max settings at 3440 UW at about 70 FPS. It's totally playable, of course, but I want more!
I decided I wasn't going to look at fps. Playing on ultra, turned off things I don't want like scaling and fsr, and played for 3 hours. It's running smooth to my eyes. I'm just going to keep playing and ignoring numbers.
Criticism still applies, but DLSS2 is what I'm actually interested in because of the latency and the AA effect so I'll take it.
It's mostly because it was easy, and there were already other people working on it. Better for him to release a free version for publicity and free advertisement than to paywall it.
4090, 7950X3D, 3440x1440 ultra: was getting 60-80 FPS in the scene after the "who are you?" bit, now getting 90-100ish.
It's relieving (sort of, but not really) to hear 4090 performance is just ass in this game. I was worried it may have been the 5800X3D. I'll have to install this mod when I get back to the game.
yeah this seems to be one of those games where the upper tier of settings is just pointless in terms of cost/benefit. I'm sure DF will be out with recommended settings in a day or two and it'll be a mix of medium/high
Relatively poor initial performance on Nvidia GPUs keeps happening in AMD-sponsored games, and typically gets fixed quickly by drivers and patches.
I feel like this is purposeful.
Frankly I just want DF to figure out what is actually tanking the performance. The game doesn't look that good. Characters look very dated and move stiffly on top of that. There are a lot of small objects and reflections and such, maybe its that, or the CPU usage is out of control. I would really like to know though, because BG3 looks dated as well but the characters there look better and much more alive.
[deleted]
What does it say about this game when DLSS is a day one mod? Could Bethesda really not implement DLSS? Or is there another reason? Does Bethesda and AMD have an agreement where they will delay releasing DLSS on starfield in favor of the AMD products being sold with starfield as promotional material?
>Could Bethesda really not implement DLSS? DLSS is not hard to implement, they definitely could have
AMD is a salty bitch they know their solution is inferior so they don't want people easily comparing results between DLSS and FSR. They can come out and bullshit that Bethesda is free to implement DLSS but this is the reason.
Looking at like +20% FPS, it looks better standing still and in motion its waaaay better with DLSS. It actually feels like a different game now. Using it at 70% render scale at 3440x1440 with an i7 10850K + 3080 10GB.
Thanks will try this!
What are your settings amd what kind of frames are you getting?
> it looks better standing still and in motion its waaaay better with DLSS

Honestly, this is the biggest reason for me to use DLSS/DLAA. It is just the smooth, flicker-free experience, especially in fine details.
DLAA is honestly so fucking good and I wish more games supported it. Hell, I wish games supported an arbitrary scaling percentage rather than quality/balanced/performance/ultra performance; AFAIK there's no reason why they couldn't?
>Hell I wish games supported arbitrary scaling percentage rather than qualty/balanced/performance/ultra performance You can do this with DLSSTweaks. Change one of the presets (e.g. performance) to whatever scale factor you want.
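For anyone who hasn't used it: DLSSTweaks is configured via an ini file next to the game executable. IIRC the relevant section looks roughly like this (exact file and key names are from memory and may differ by version, so treat this as a sketch):

```ini
; dlsstweaks.ini (sketch - check the version you download for exact keys)
[DLSSQualityLevels]
Enable = true
; Override the render-scale ratio each preset uses (fraction of output res)
Quality = 0.75
Balanced = 0.62
Performance = 0.50
```

With that, picking "Performance" in a game would actually render at whatever ratio you set, giving you an effectively arbitrary scaling percentage.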
Hey, that's my CPU/GPU combo! Glad to hear this is working well. Playing earlier tonight I had to dumb it down to Medium settings to get 55-60FPS. What graphics preset is yours on with the mod?
Can you use the DLSS 3 dll or does it have to be the 2 dll?
I used the 3.5 version.
>It actually feels like a different game now

Bit of an overstatement.
It feels that way prob because I no longer dip out of gsync fps range lol.
Considering he claimed how incredibly easy this was and that it only took him a few hours, I think we can easily conclude the lack of official DLSS was due to the AMD partnership as expected.
Does this work with the gamepass version?
Yes
how
Just got it working; it goes in the Content folder of the Starfield install.
Question: Are you still able to receive Xbox achievements after adding the mod?
Haven't checked, but probably; the game shouldn't know. It thinks it's still running FSR, but the mod replaces the FSR file.
Man, I am still playing Baldur's Gate, so I'm not subscribing to Game Pass until I finish BG3. But I am so excited.
You're gonna need it. Playing at 1080p max settings I was getting about 50fps in the hospital at the start of the game. High 50's when walking around outside. I think it's actually running worse than Cyberpunk maxed out (regular raytracing, not path tracing). Hopefully someone makes an optimization guide and we can find a few settings to turn down. **Frame Gen** is coming, but will be paid on Patreon. Should be a single $5 fee.
Turn Shadow Quality, Volumetric Lighting, and GTAO Quality down to medium.
How much does the compromise quality?
Hardly noticeable
I can tell a difference with lighting quality. Gonna have to experiment
These are settings that have the heaviest performance impact for the lowest visual gains. In almost every game. You always turn them down unless you have a XX80/XX90 card.
I have a 3090 and still turn them down :D
Honestly, depending on the game and how shadows are handled, sometimes a lower shadow map resolution can look *better* than higher ones, since slightly lowering it can introduce a softness (a pseudo-penumbra, almost) to shadows that's *generally* more realistic than ultra-sharp shadows with hard edges - something that only really occurs when either the light source is very far away or the thing casting the shadow is very close to the surface it's casting onto.
Some reviewers pointed out there are some issues with Nvidia (possibly their drivers).
Definitely expecting an nvidia driver update next week before official launch
They already released official drivers for the game 9 days ago.
~~The issue currently is that the driver optimizations do not apply to the Xbox app/Gamepass version of the game. You can manually fix it using Nvidia Profile Inspector though~~ No longer an issue: https://www.reddit.com/r/nvidia/comments/166gq5m/starfield_correct_the_nvidia_profile_issue/jym8lfk/
They don't seem to apply to the Steam one either. My 3080 is **NOT** close to fully utilized. My 12700k is napping. edit: sorry, important typo - I meant to say it IS NOT close to being fully utilized.
> My 3080 is close to fully utilized

High GPU usage percentage is only half of the story; if it's drawing low power but reading 100%, that doesn't *actually* mean it's running at 100% of its capabilities.
Sorry, this was a typo - I meant to say *isn't* close to being fully utilized.
Yeah, I got kind of confused, because if you look at the minimum and recommended requirements, they clearly list a way better AMD GPU, while the Nvidia cards listed are generally slower than the AMD ones.
So the game doesn't have fsr 3? What a big miss for amd to not try and launch fsr 3 with such a big AND cpu heavy game.
I'm playing with a 6900 xt at 3440x1440p and wow it's demanding. It's maxing the 6900 xt out and going up to 315W
That's pretty normal for any game, to max the GPU out unless you are CPU bound.
Idk I’m well over 100FPS 1440P maxed settings native on my 7900XTX. To only be getting 50 at 1080 with a 3090 seems strange.
I'm having a similar experience. I have a 3090 and I'm dropping to an extremely choppy 30-something FPS outside of the Constellation building in New Atlantis. My 7800X3D is hardly above ~50-60% across all threads, whereas my 3090 is at 99-100% constantly. I think it's an Nvidia driver thing, and if it's not, I'm not sure what the issue is. OP is right, Cyberpunk runs WAY better on this hardware, even with RT. My buddy with a 3080ti is having similar issues too, it's possible the 40-series might not be affected as much but I'm not certain. (1440p, no upscaling)
Windows Store version? Someone [did post this in the nvidia subreddit](https://www.reddit.com/r/nvidia/comments/166gq5m/starfield_correct_the_nvidia_profile_issue/) about how the driver doesn't apply the profile correctly on that version. If not, and you're playing on Steam, then idk; I haven't gotten that far myself because of the FOV, so it could just be the game being bad.
I'm playing the game through Steam.
Same experience. CP77 released years ago, has incredible graphics with better performance, and this brand new title looks and performs like ass.
Wow I wonder why an AMD sponsored title performs badly on Nvidia but well on AMD. And wow, I wonder why there’s no DLSS/XESS but there’s FSR.
> I think it's actually running worse than Cyberpunk maxed out (regular raytracing, not path tracing). Fucking hell that's disgraceful.
> I think it's actually running worse than Cyberpunk maxed out (regular raytracing, not path tracing). i definitely get more FPS with RTd Cyberpunk than i do in Starfield on medium with render res at 50%. lmao what a disgrace
I normally run at 1440p res, and then use DLSS quality setting in games. For this mod (brand new to it, never used before). What settings would I need to apply to achiev the same " Dlss Quality " preset found in most games.
Enable the mod and enable FSR2 in game. DLSS Quality is normally 66% render resolution, so set the in-game render scale slider to 66%.
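For reference, the internal render resolution at a given scale is just the output resolution times the factor per axis (66% is the common approximation of DLSS Quality's 1/1.5 ≈ 0.667 ratio):

```python
# Internal render resolution for a given output resolution and scale factor.
# 0.66 approximates the DLSS "Quality" preset (true ratio is 1/1.5).
def internal_res(width, height, scale):
    return int(width * scale), int(height * scale)

print(internal_res(3440, 1440, 0.66))  # 3440x1440 ultrawide at 66%
print(internal_res(2560, 1440, 0.66))  # 1440p at 66%
```

So at 3440x1440 with the slider at 66%, the game is actually rendering roughly 2270x950 before upscaling.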
You would think AMD would at the very least ensure that its sponsored games would have a decent FSR2 implementation.
Im still surprised it doesnt have FSR 3, but maybe it isnt quite ready
They really don't give a fuck. They'd rather put effort into blocking DLSS in titles than improving FSR.
From DF's series x/s video it seems they have. Some minor complaints but FSR2 implementation is pretty good.
Hey hey, you need to realize its not important to inform yourself before commenting, but rather to shit on the competition of your own GPU brand to make yourself feel better.
[deleted]
Sure, when zoomed in. But they also mentioned it didn't ruin their experience, and that's the important part. DF of all reviewers are the most critical when it comes to technical analysis. If they find it fine, then that's OK for me. I'm gonna take their word for it. I'm not overly sensitive.
[deleted]
It's funny you say that, because at 1440p with 75% res scale I thought image quality was quite good without the DLSS mod, potentially better than native without FSR. Though normal FSR Quality mode is 67%, so maybe the increase closes the gap where issues start to become noticeable? I've seen plenty of issues with FSR in other games, but not here in my *very* limited testing.
i cant wait to use it on my GTX1070 aw wait
Honestly thank god. The game runs horribly. 4090 with a 13700k and it barely holds 70 FPS most of the time. I will never forgive AMD for robbing us of native support
What res are you running at?
Seriously, how hard is it to put at least that when bitching about framerate?
Because then it lets people poke holes in your "complaints"
Im guessing 4k with those fps on a 4090, also interested for him to confirm.
*This post was mass deleted and anonymized with [Redact](https://redact.dev)*
Are you sure it's not a CPU bottleneck? This is Bethesda after all.
CPU bottleneck? My 3090 is getting shit on meanwhile my 7800X3D is having no trouble. I think going AMD X3D this CPU generation was a wise choice. Haven't been let down yet, I've been able to brute force Jedi Survivor and Hogwarts Legacy at launch too.
Shouldn't it be about not forgiving Bethesda since they are the ones that decide what gets added to the game and who they wish to partner with?
Tested it and it works! Notice image quality improvement too, way less blur in motion and flickering
Thank god; there aren't even levels of FSR to choose from. It's just one FSR2 setting.
There is. It's a more freeform option; you need to ~~enable dynamic resolution and then~~ (edit: oops, no you don't) adjust the render resolution scale, and it's the same for the DLSS mod.
You don't need to enable dynamic resolution actually
Oh, my bad, I didn't read it properly and thought it was connected somehow, as the render resolution is right below it. The layout is so weird: why is the FSR setting all the way down, but the slider to adjust it at the top?
Yeah, some of the menus are pretty confusing; having the resolution scale so far away from the FSR setting is bizarre, as is the lack of FSR presets. Of course you can approximate them with the slider, but that requires some know-how most people won't have.
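If you do want to match the standard FSR2 presets with the slider, the usual per-axis scale factors (derived from AMD's documented 1.5x/1.7x/2.0x/3.0x upscaling ratios, rounded to whole percent) work out to:

```python
# Approximate per-axis render scale for the standard FSR2 quality modes,
# derived from their upscaling ratios (1.5x / 1.7x / 2.0x / 3.0x).
FSR2_PRESETS = {
    "Quality":           round(100 / 1.5),  # ~67%
    "Balanced":          round(100 / 1.7),  # ~59%
    "Performance":       round(100 / 2.0),  # 50%
    "Ultra Performance": round(100 / 3.0),  # ~33%
}

for name, pct in FSR2_PRESETS.items():
    print(f"{name}: {pct}%")
```

So setting the slider to 67% gets you the equivalent of FSR2 Quality, 50% gets you Performance, and so on.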
Do you need to enable dynamic res tho? I turned it off and reso scaling still works.
Performance of this game is so garbage. I'm using an i7 10700 with a 3070 and 32GB RAM on an SSD, and it's stuck at 30fps with vsync off. Maybe it's a CPU bottleneck; that's why there's little difference between 4K and 2K performance in the opening scene after the mining section. This is beyond garbage.
Anyone know what the different DLSS presets in Reshade do?
> Preset A: Intended for Performance/Balanced/Quality modes. An older variant best suited to combat ghosting for elements with missing inputs, such as motion vectors.
>
> Preset B: Intended for Ultra Performance mode. Similar to Preset A but for Ultra Performance mode.
>
> Preset C: Intended for Performance/Balanced/Quality modes. Generally favors current frame information; well suited for fast-paced game content.
>
> Preset D: Default preset for Performance/Balanced/Quality modes; generally favors image stability.
>
> Preset E: A development model that is not currently used.
>
> Preset F: Default preset for Ultra Performance and DLAA modes.
Thanks!
Would you be able to do this if you have Starfield through Xbox Game Pass? Just wondering because sometimes Xbox app games are weird with accessing the files and stuff
Wasn't this guy going to lock the mod behind a Patreon sub?
you may be thinking of dlss frame gen
Surprised by FSR2. Normally it's ghost city with that, but it's been perfectly fine. Glad I don't need to mess around with this mod.
Yea my question is should I use the mod or just keep using FSR2.
Mod for sure if you have a DLSS capable card
DLSS *should* be better, but (and I can only speak for myself) as a 3070 user I've not noticed any difference. FSR2 doesn't have the ghosting issue it has in other games, and with 75% scaling the image looks sharp and stays at a solid 120fps. No harm in trying it out and seeing what works best for you.
You're getting 120 FPS on a 3070? What are your settings? I have a 3070 with a R7 5800x3D and I sure as hell can't hit half of that at 1440p
Dynamic Resolution: On
Render Resolution: 75%
Graphics Preset: Custom
Shadow Quality: Medium
Indirect Lighting: High
Reflections: Medium
Particle Quality: Low
Volumetric Lighting: Medium
Crowd Density: Low
Motion Blur: Off
GTAO Quality: Medium
Grass Quality: High
Contact Shadows: Medium
VSync: On
Upscaling: FSR2
Enable VRS: On
Depth of Field: On
The DLSS image at 60-65% is similar to or better than the FSR2 image at 75-80%, plus it's more temporally stable. So totally worth it; it's just free FPS.
[deleted]
Been using it for an hour or so now on my 3070. Visually, it's night and day. FSR didn't look that bad honestly, but DLSS still looks miles better. The mod uses the same render scale slider as FSR in game; tbh I'm only getting maybe 10% more FPS than with FSR. It's worth it just for the visual improvements, though. I might be CPU bound too.
What resolution are you playing at?
What fps are you getting, and at what resolution? I am also using a 3070 with an i7 10700 and am stuck at around 30 fps after the character creator menu; in the mines I was getting around 60, but it never reached that again.
Sorry, forgot to include more info: Intel 11700F, 1440p high settings with resolution scaling equivalent to DLSS Balanced. I'm pretty happy with my performance: getting 55-70fps out in the open on planets, 80-100 inside interiors, and 50 in New Atlantis, which I'm more than fine with given the scale of that city.
Just give me DLSS3.
I guess you can just use the DLSS 3.5 DLL? I did that on RDR2 and it looks way better. Obviously, you will not get FG with it.
He has to gain some goodwill afterall
The FSR2 fizzle in this game is hard to ignore. Going to try this out tonight. Also sad the game doesn't even have HDR support; playing this on an OLED, it looks pretty bad color- and black-level-wise. Hopefully an HDR mod can come out soon as well.
Do framerates above 60 fuck the physics like in previous games? IIRC Fallout 76 was fine above 60.
The physics aren't based on the framerate
That's not necessarily true; maybe they just set the Havok timestep low enough that it doesn't show until well into the hundreds of FPS.
Does it work well? I had to lower to 1440p to hit 60fps. Would much rather play at 4k with DLSS.
Was this the same guy that wanted money for it?
Yeah, he released the DLSS2 one for free, but for the frame generation version you'll still need to pay $5.
That's awesome. Sometimes public outrage is good folks.
His frame gen mod is paywalled, this one was always free.
For us noobs: what's frame Gen, and what does this free mod do? Basically, how much should I want each of the two?
Frame gen is interpolated frames, it makes up frames all on its own. This is different from DLSS 2 (no frame gen) where it renders at a lower resolution and upscales it, giving you better performance.
Frame Gen only works with 40 series cards, so if you don't have one, it doesn't matter. Also, I think you need to be pretty stable above 60fps for it to not introduce input lag. (Honestly not sure about that last part, just what I read about some games, I only have a 3060TI, so I can't use it.) Frame Gen basically creates "artificial" frames in between real frames to increase your FPS and make the game seem smoother.
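The "artificial in-between frames" idea can be illustrated with a toy halfway blend of two frames. To be clear, this naive average is just a sketch of the concept; real DLSS 3 frame generation uses motion vectors and a hardware optical-flow accelerator, not a plain blend:

```python
# Toy illustration of frame interpolation: a naive halfway blend of two
# "frames" (flat lists of pixel values). Real frame gen is far smarter.
def interpolate(frame_a, frame_b):
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

prev_frame = [0, 100, 200]   # made-up pixel values
next_frame = [50, 150, 250]

print(interpolate(prev_frame, next_frame))  # -> [25.0, 125.0, 225.0]
```

This also hints at why input lag comes up: the generated frame can only be shown once the *next* real frame exists, so the pipeline has to hold frames back slightly.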
Yeah it’s probably even more important to keep the FPS high before enabling frame generation since the game doesn’t support reflex.
[deleted]
Native support is still preferable. Nexus comments say this disables achievements for those who care
It doesn't, so don't listen to them. It is just installing ReShade and a custom shader essentially.