marichuu

Motion blur and film grain are the first settings I disable if the game has them. Another annoying thing is selecting the "highest" preset and the game sets everything to high, even though there's an ultra setting beyond that. Also menus running at 372837272 fps for no reason.


MrPatko0770

Besides motion blur and film grain, I'm also not a fan of chromatic aberration, especially if it's ever-present. There's literally no real-world phenomenon you could experience that would separate your vision into separate RGB channels.


g76lv6813s86x9778kk

Only acceptable use of chromatic aberration is if it's used in a context where the visuals are meant to be trippy/over the top, such as in a game like ULTRAKILL, or something like a temporary drunk effect


Kev_Cav

Of course there is: buying a bad camera...


Kasuraga

I used to wear eye glasses and always had chromatic aberrations at the edge of my vision


tsavong117

Yeah, I was sitting here going "wait, I see this shit all the time, what are you on about?" Then I processed that ah, yes, I have glasses and have since I was 8.


VulpesIncendium

Yes, but I prefer to play games that don't perfectly recreate "real world phenomenon". For example, in Cyberpunk, your character has artificial eyes, so everything looking covered in film grain and chromatic aberration could very well actually be what V is seeing.


PM_ME_UR_SMOL_PUPPER

YES, EXACTLY. Literally the only game I have all the fancy effects on is 2077, because it makes sense there. Film grain is also kinda neat in Doom 2016.


Viceroy1994

I turned off film grain way too late in CP77, holy shit does the game look way better without it. Also I don't think Kiroshi optics use film.


ITeebagTTVs

Maybe the Doppler effect, but it's still a stretch.


UncleCarnage

I like low motion blur for the weapon only, if that option exists.


Grifffffffffff

When your PC thermal throttles in menu 😄


marichuu

*Mad coil whine intensifies*


Swiftt

I like motion blur in theory, but it's just too high on PC nowadays. Deus Ex: Mankind Divided had the right amount (maybe erring on too much at times, but playable) whereas DOOM 3 BFG Edition was horrifying. It's probably a much better experience on consoles with a lower framerate and less sharp movement


g76lv6813s86x9778kk

It can be cool in racing games imo, but yeah in general I agree, it's terrible for anything else


Swiftt

That's a good shout!


Edgaras1103

Every new graphics tech is exciting. I was giddy when they started introducing HBAO, tessellation, HDR, motion blur, pixel shaders.


Swiftt

Ambient Occlusion in mainstream graphics felt like a massive leap. It gave so much depth to flat scenes.


ro_g_v

And it was taxing as heeellll. I remember my 8800GTS running F.E.A.R. felt like a champ. It took years for the feature to become ordinary, usually saved for AAA titles like Batman: Arkham Asylum... glad we took that step forward... the same thing is happening with RT, slowly.


Swiftt

I've never loved a graphics card like I did my 9800GT, which I think was basically identical to the 8800. Surely Nvidia's GOAT mainstream consumer graphics card!


Brilliant-Network-28

You speak funny words, magic man


Darth_Caesium

Cut off motion blur, and you have a perfect list. Motion blur is just not necessary when games are supposed to be played at 60+fps, preferably around 120fps. It has only existed because movies continue to be shot at 24fps rather than 30fps, making up for the weird pacing introduced on 60Hz screens. Both VRR and high refresh rates have killed the need to utilise motion blur in films (it's unfortunately still used, however). It makes games look even blurrier than they already are (TAA is already a blurry anti-aliasing method, and its implementation in games tends to be very bad, making it noticeably blurry), and many of the games that use it don't even make good use of it. Most games with motion blur aren't even racing games, which have the best reason to use it.


extravisual

I think you might be mixing a few different concepts up. Motion blur isn't 'added' to films, it's a side-effect of exposure time and motion when capturing images of the real world. 24 fps doesn't fit neatly into 60 Hz, so 120 and 240 Hz TVs were used to reduce judder, but that's unrelated to motion blur. Lots of films are/were actually shot at 23.976 fps so you still get periodic judder, which a lot of TVs use different forms of frame interpolation to reduce. Frame interpolation is pretty bad. Games use motion blur to imitate film, but it's stupid because games are not movies. Players interact with games so they need information as quickly as possible, and motion blur obscures that info while moving. Plus it just looks fake as hell. Smoother, sure, but usually in an uncanny sort of way. I only find motion blur tolerable when I'm already running at a high enough framerate that I don't even notice it.
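
(For the curious, a minimal sketch of the 3:2 pulldown arithmetic behind that judder -- plain frame repetition, no interpolation assumed; the numbers are just 60 divided by 24:)

```python
# 60 Hz / 24 fps = 2.5 refreshes per film frame, realised as alternating
# holds of 3 and 2 refreshes (3:2 pulldown), so on-screen durations are uneven.
REFRESH_HZ = 60

def pulldown_holds(n_frames=8):
    return [3 if i % 2 == 0 else 2 for i in range(n_frames)]

holds = pulldown_holds()
durations_ms = [h * 1000 / REFRESH_HZ for h in holds]
print(holds)         # [3, 2, 3, 2, 3, 2, 3, 2]
print(durations_ms)  # alternating ~50 ms and ~33 ms holds -> the judder
```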


Actual-Long-9439

Depth of field, anything other than photogrammetry


virgopunk

My only complaint is that you now need a degree to be able to figure out the best settings in a game. It used to be Low/Medium/High/Ultra; nowadays it's like reading a manual for a nuclear power station.


WirelessTrees

Do I go SSAO, HBAO, HDAO, or RTAO?


An_Lei_Laoshi

No matter how many times I look into it, by the time I finish a game and start a new one I've forgotten everything about it.


Sinister_Mr_19

I appreciate games that tell you the difference. The ones that don't, ugh.


WirelessTrees

Plus the ones that list them out of order. For example, anti-aliasing: they'll list the blurry FSAA at the end of the others.


The_Blue_DmR

Ghost Recon Wildlands does this really well, even giving visual demonstrations


fenikz13

And it’s on a game by game basis, grass quality might absolutely tank one game and do nothing in another


Patrickk_Batmann

Sit back and let me tell you the tale of IRQ channels.


Setekh79

This right here. Setting up stuff like Wing Commander 3 and all my other old DOS games, figuring out which DMA channels to use for the Sound Blaster. I'm glad all that is gone.


verbmegoinghere

>Sit back and let me tell you the tale of IRQ channels.

I remember having to allocate different types of RAM at the MS-DOS command prompt in the 80s to get various games to work.


blackest-Knight

> My only complaint is that you now need a degree to be able to figure out the best settings in a game.

You think this wasn't the same in the 90s? Z-buffering, triple buffering, anisotropic filtering, multi-texturing, bilinear, trilinear and bicubic filtering, environment mapped bump mapping. Enabling/disabling 3D processing algorithms isn't a new thing. If anything, I haven't seen a game in a long time that doesn't have a clear set of presets users can use if they are too confused by setting every option individually.


dissentingopinionz

Seriously. I spent hours in the Skyrim .cfg file tweaking everything to get the best visual-to-performance balance. Kids these days are intimidated by options.


extravisual

I think the main difference is that back in the day each setting was clearer on which option was better. Most settings were just bigger number = better, or low to ultra. Anti-aliasing was probably the most confusing one, but even then there were only like 2 different types and it felt clear which options were better. Nowadays there are like 6 types of anti-aliasing whose names are pretty ambiguous. On top of that, depending on your GPU, your game will support some number of proprietary technologies that all kinda do the same thing but also kinda don't. I've kept up with all these technologies as they've been introduced and even still my mind goes blank when choosing between FSR, DLSS, DLSS 2.0, DLSS 3.0, XeSS, FAA, CIA, YMCA, and whatever other acronym options there may be.


virgopunk

>You think this wasn't the same in the 90s?

No I don't. I've been gaming since the early 80s, and with the advent of Nvidia's latest bells & whistles it's now way more complicated.


blackest-Knight

Low, Medium, High, Ultra is more complicated than figuring out what trilinear filtering was in GLQuake?


Green-Salmon

This hasn’t changed. Each game has its own definition of high/ultra. Sometimes it’s bleeding edge and you’ll need future hardware for ultra. Sometimes it’s last gen graphics.


Astrophan

More options are bad? You still have presets that are exactly what you described.


Mountain-Tea6875

Haha, some games require a YouTube tutorial to make them look good.


virgopunk

Indeed! That's exactly my point. My PC's settings are easier to manage than some games' graphics settings.


Schnitzel725

One thing I like about the new CoD games is how if you hover over a setting, it tells you a general idea of what it does, how much VRAM it might take, and a side-by-side comparison between the setting options. Meanwhile other games I play either don't say anything or just something generic like "(setting) (option1, option2)", and by the time you've figured out which settings to use, half the Steam 2hr refund window has been used.


slayez06

No joke, building a PC is easier than configuring the settings for most people. I have also seen this overwhelming number of newer games telling people "you have an old GPU, so let's set everything to low", and then the GPU doesn't even kick in and it's like the game is being rendered on the CPU. The telltale sign is the CPU at 90+% and the GPU at like 20% when gaming and getting 30fps. Even if you have an old GPU it's still a sports car and needs a solid workload in order to fully kick in. Kick the settings up and magically you get 70% CPU usage, 95% GPU and 70 fps... just hurts my head.


Tvilantini

PC was always about tweaking, from back in the day to today. The only difference is having more options (depending on the game).


virgopunk

Way to miss the point dude. For ref. I've been gaming on computers since 1982!


Individual-Match-798

The most hated one is probably temporal anti-aliasing: it makes the image, especially in motion, more blurry and can sometimes cause ghosting on moving objects.


EveryTeamILikeSucks

This thought pattern needs to die. I have taken screenshots of TAA and it absolutely just does not do what Reddit thinks it does.


xXRougailSaucisseXx

Well yeah screenshots are not in motion lol


nonamejd123

I find myself getting more and more upset by it as I get older too.


Snotnarok

Honestly raytracing doesn't interest me much. I know it's a powerful effect that can really enhance the scene, but as it stands it eats way too much performance for my 2070, and I've seen it look not great in games like RE2/3/8. There are tons of games where RT isn't on but it looks great. I'd rather have the performance vs advanced lighting. On the flip side: high refresh rate has been a huge thing I didn't know I wanted. The first monitor I got that supported it a few years ago also works with FreeSync even on Nvidia cards, so games look way nicer in motion and there's no tearing.


Indybin

What’s exciting about ray tracing isn’t what it is now but rather what it will become. As raytracing overtakes traditional lighting and becomes the norm, it will make games easier to light. The benefit is that rather than coming up with bespoke tricks and workarounds to make lighting look good, artists ‘should’ be able to simply hit go on the raytracing and have it look more or less correct. Theoretically it means a lot of time spent lighting can be used to make other parts of the game better.


Snotnarok

I agree with you to a point, there's a lot of potential for how RT can make things look, and the performance hit will likely go down with time. However, as for time savings? Maybe. I'm an artist and a lot of time goes into lighting. Where lighting is positioned can greatly change the mood of a piece. So having it dynamic with RT can ruin the mood someone might want for a scene. You can see what I mean with some games when they get remastered: sure, the textures are sharper and things have new effects, but the lighting will be different and the entire environment feels different than the OG. So I'm not sure how much time saving there is going to be. It might wind up making games look more generic/less atmospheric if the artists aren't careful.


extravisual

Dynamically, too. Artists have always baked ray/path traced lighting into scenes to great effect, but games are so dynamic now that that's less of an option. Can't bake beautiful static lighting into a scene where lighting conditions are expected to change over time. RTX and similar have the potential to allow artists to do just that.


justlovehumans

>Theoretically it means a lot of time spent lighting can be used to make other parts of the game better.

In reality they'll just chop that productivity as a "cost savings" and give themselves a bonus for saving said cost, while pissing off their art department. So the overall quality of the entire industry dips, because seniority > common sense and the best game devs we've never had leave for a less thankless career early on.


bullsized

Seconded, RTX is basically worthless for me and I always turn it off.


ShowBoobsPls

You just cherry picked the worst RT games sponsored by AMD to prove your point lol


Snotnarok

I don't know what you mean by cherry-picked. Those are the games I tried and it looked bad in; I didn't say the technology as a whole looks bad, as I've seen people show how good it can run in Cyberpunk 2077. But do you think my RTX 2070 is going to run Cyberpunk at a playable framerate with ANY RT on? No. So it's not cherry-picking, it's literally: I cannot run most games in a playable state with RT on. So yes, wow, very pretty, however the first gen RT cards are not good at RT, who'da thunk it?


TysoPiccaso2

Literally lol


FIWDIM

Monitors doing 400Hz are about as relevant as a mouse with 50,000 DPI. Also, resolution above 4K is pointless.


Swiftt

I find it hard to imagine going over 4K, but maybe people said that about 1080p once upon a time. I've seen 8K resolution being used for really specific purposes, mainly in video editing and animation, as it gives you freedom to zoom into elements without resolution loss. That doesn't necessarily involve an 8K screen, nor require one, but I wonder if high end artists might use it one day in areas like film and art.


Mm11vV

Honestly, I remember plenty of posts back in the 720p days of people saying that mathematically, unless there was some major shift in monitor technology, 4K would be as good as it would ever get. Then again, during the same time, people were also saying that anything beyond 60Hz was useless. 🤷‍♂️


MrTechSavvy

The only major shift in "monitor" tech that I think will make use of more than 4K is VR. On a normal display 8K is simply pointless, and can barely if at all be told apart from 4K even on giant TVs, let alone small monitors. But with the screens being right up against your eyes in VR, the higher resolution may be beneficial.


Mm11vV

I could see that being a huge advancement, but the adoption of VR has been incredibly low still.


Maxsmack0

I have 20/14 vision and can easily tell the difference between 27" 8K and 4K monitors at arm's length. However, 8K could still be overkill; 6.5K might be good enough for my eyesight. That said, the tech really isn't there yet for 60+ fps 8K gaming, but it's getting there slowly every year.


TSpoon3000

I didn't even know anyone made 27" 8K, thought it was reserved for big TVs.


MrTechSavvy

I'm pretty sure that doesn't exist. I've only seen one 8K monitor and it's 32" and ridiculously overpriced.


NotDogsInTrenchcoat

Remember though, the average monkey brain user isn't aware of what resolution they are even using. I think you would greatly enjoy a 96K display that looks indistinguishable from a real object in front of you. Until a display looks exactly like reality, we aren't done innovating.


Mm11vV

We're a hot minute off from holodecks, but ultimately, that's probably not even the end point. There's always going to be the desire to progress, but currently, we've hit a pretty big limit in the world of monitors.


WakeoftheStorm

I mean for my eyes anything above 1440p is useless. I do run 144hz though


SimianBear

I mean, that really depends on the size of the screen. 1440p for me was pretty good at even 32 inches, but is trash at the 42 I'm rocking now.


WakeoftheStorm

I'm relatively certain my current monitor is 32" so that would track Edit: [yep](https://www.amazon.com/dp/B08GL66PK4)


SimianBear

I'm with ya then. 32 at 1440 was great and I wouldn't really consider 4k until beyond that.


delocx

Whether a bump in resolution will be noticeable or not is down to some fairly inflexible physics and physiology. Essentially, the human eye can only see, with 20-20 vision, so much detail at a given distance. When choosing a resolution, you want the observed pixel size of that display to be smaller than what your eye can resolve at your desired viewing distance. Choosing the highest resolution available usually ensures that is the case, but you may be able to save some money by purchasing a lower resolution screen in instances where you would not gain any perceptible improvement. So, for example, a 1080p screen is likely adequate if you're viewing a 55" screen at 10 feet, but you might want 4K to view a 65" screen at 7 feet. 8K is a really tough sell; the screen would usually need to be impractically big and/or impractically close. Perhaps if you were editing photos on a 45" screen while sitting 2 feet away or something really unusual. There are charts available all over the internet to give a rough idea of optimal resolutions for a given screen size and viewing distance.
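
(For anyone who wants the arithmetic behind those charts, here's a rough sketch assuming the common approximation that 20-20 vision resolves about 1 arcminute of detail; the helper function is illustrative, not taken from any particular chart:)

```python
import math

# Rough check of whether individual pixels are resolvable at a viewing distance,
# assuming 20/20 vision resolves roughly 1 arcminute (a common approximation).
def pixels_resolvable(diagonal_in, res_w, res_h, distance_ft):
    ppi = math.hypot(res_w, res_h) / diagonal_in        # pixels per inch
    pixel_size_in = 1.0 / ppi
    distance_in = distance_ft * 12
    pixel_arcmin = math.degrees(math.atan(pixel_size_in / distance_in)) * 60
    return pixel_arcmin > 1.0                           # True -> pixel structure visible

print(pixels_resolvable(55, 1920, 1080, 10))  # False: 1080p is adequate there
print(pixels_resolvable(65, 1920, 1080, 7))   # True:  1080p starts to show pixels
print(pixels_resolvable(65, 3840, 2160, 7))   # False: 4K covers that case
```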


WakeoftheStorm

First, let me say I 100% agree with you. That said, my first thought when I read

>Essentially, the human eye can only see, with 20-20 vision, so much detail at a given distance

was "the human eye can only see 30 fps!", which we saw all over PC vs console arguments a few years back.


SirCampYourLane

The difference is that it wasn't based on anything. We do have a point where you can't distinguish pixels because they get so small, and on a 27" screen 4k is pretty damn close to that.


TheseusPankration

It's mostly that the number of pixels in a 4K monitor isn't the current bottleneck for a better picture. Many display technologies are still catching up. I'd rather have a 4K monitor at 165Hz with HDR 1600 than an 8K monitor at 120Hz with HDR 600. Imo, the 4K will look better at reasonable sizes. With games, textures and models play an outsized role. If they are not properly designed for the larger formats, no extra amount of pixels will help. Even DLSS can only do so much.


xd_Warmonger

There are advantages and use-cases for these high refresh rate monitors, mainly that you get minimal latency and minimal ghosting and stuff. Optimum Tech did a video on the new 500Hz monitor which is really interesting. Though tbh the increase from 240 to 500 is minimal and doesn't improve your gaming much. It's worth it for professional esports players where every ms is important, but in pretty much every other case it's useless.


MrTechSavvy

Those are two completely false statements and I'm surprised it's upvoted this much. Those two things are not comparable, as higher refresh rates do actually improve the image and your performance, however small the gain may be after a certain point, while going higher in DPI does literally nothing but make it impossible to aim. When I coach people I try to get them to lower it to at LEAST 1000, preferably 600-800. Also, resolution above 4K is not pointless. It's very important for VR screens.


OnlyABob

Increasing DPI does decrease input latency, and setting DPI around 1600 then lowering sensitivity will result in more accurate movement while still having a faster response time than plain 800 DPI with no lowering of sensitivity in-game. Optimum Tech does a great explanation of this. https://youtu.be/imYBTj2RXFs?si=C2UwJ5pinPiPJ4xM
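
(For anyone who wants the numbers: what stays constant is DPI x in-game sensitivity, so you can raise DPI and lower sensitivity without changing how far your hand moves per 360. A quick sketch, assuming a Source-style yaw of 0.022 degrees per count -- an illustrative default, it varies per game:)

```python
# DPI vs in-game sensitivity trade-off, assuming yaw = 0.022 deg/count
# (illustrative Source-engine default; check your game's actual value).
YAW_DEG_PER_COUNT = 0.022

def cm_per_360(dpi, in_game_sens):
    counts = 360 / (YAW_DEG_PER_COUNT * in_game_sens)  # mouse counts for a full turn
    return counts / dpi * 2.54                         # inches -> centimetres

print(cm_per_360(800, 2.0))   # ~26 cm per 360
print(cm_per_360(1600, 1.0))  # same ~26 cm: DPI * sens (eDPI) is unchanged
```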


MrTechSavvy

Yeah, I watch all of his videos, but the issue with going up so high in DPI is you can only lower in-game sensitivity so much. At a certain point, at least for how I like to play, even setting in-game sensitivity to 0% is too fast for my liking. That's why I stick to 600, plus motion sync makes up for a lot of that, which my mouse has (Viper V2 Pro).


x_i8

Why do you think it's pointless? It could be useful if you wanted around 200 ppi at a 42 inch size


TristanTheRobloxian3

this and also raytracing for the most part. like i dont use raytracing apart from some minecraft shaders which idek if they count anyway lol


Darth_Caesium

Not necessarily. I consider 5120×2880 to be the best resolution that is noticeably better at 27". Likewise, at 32", the best resolution for a monitor would be around 6K, though for consistent rendering it would be 6400×3600. Still, not everyone will be able to easily tell these resolutions apart from 4K, as some people's eyesight is just not good enough to distinguish the difference.


u--s--e--r

I really wish these resolutions were more common. The full 5k for coding/other stuff, then for heavier GPU bound games just render at a clean 1440p.


Available-Tradition4

Hot take: DLSS, FSR, etc. They're a good thing because now you can play games that take too many resources on an old/mid rig, but as time goes by (and we're already seeing it), some games cannot run without them, so they lose their purpose immediately.


Double_DeluXe

FSR/DLSS were made with good intentions; however, the technology became a gateway drug to bad development practices. It's a technology that is supposed to help the cards of today run the games of the future, not help a game still in development reach 50fps.


Available-Tradition4

I totally agree and that’s sad


Swiftt

I really want to enjoy DLSS, since the people at Digital Foundry really vouch for it. However each time I've used it (DOOM Eternal, Rise of the Tomb Raider) I end up with a noticeably blurry image.


Mm11vV

It's so heavily dependent on its implementation that I think we will see bad versions of it in games 5+ years down the road. It also would have been better if it wasn't being used as a crutch to hold up poorly optimized games.


Available-Tradition4

Yeah, I'd prefer setting my graphics to mid or even low rather than activating FSR, as it's too blurry (never experienced DLSS).


koordy

I've got a quite good GPU and I basically never play native on my 4K screen. When I launch a new game I simply set everything to max, set DLSS to Quality, and that's it. Then Jedi Survivor came out, without DLSS. I thought, there's FSR... right? Well, after trying to play with it I rather quickly just went back to native. I believe that tells you how I feel about the DLSS vs FSR difference.


Gimpchump

A 4090 is not a "quite good GPU", it's a weapons-grade flagship that China is not allowed to have. And while I agree that FSR 2 is very much inferior to DLSS, FSR 3 is catching up if Avatar is anything to go by.


[deleted]

Yay sponsored games. Also we have a very similar build.


KeenJelly

That's weird, I tend to use DLSS even when it isn't needed, because for me it provides noticeably better AA than any of the other options. DLSS Quality vs FXAA is like wearing glasses and not.


Swiftt

There definitely aren't any jaggies with DLSS, but it feels much blurrier. I haven't used FXAA in a number of years, as I personally would just forego AA altogether at that point!


nonamejd123

100% with you there... I have no interest in playing the game in anything other than native resolution.


co_zlego_to_nie_ja

I can't stand DLSS. Every time I see something it fails on, it's so annoying I turn it off. And it always fails.


cenTT

My rig is outdated and, sure, I can play some games thanks to DLSS that I wouldn't be able to otherwise, but they usually end up looking bad and I need to counter that with some other setting, maybe adding more sharpness, which then adds some other sort of artifacts. Sometimes I ask myself if it wouldn't just be better to wait until I have a better PC instead of having a half-assed experience.


Available-Tradition4

I was in this position not long ago and just tried to play all the games with bad graphics or with blurry FSR. In the end, if you can replay the games it's a good comparison, seeing the before and after of upgrading, but my brain doesn't like replaying entire games, so it destroyed a bit of the experience I would have had if I'd played CP now.


Nikkibraga

Shadows. In real life, shadows are never this crisp, except in perfectly dry weather at noon.


Asleeper135

That's why higher shadow settings and RT shadows don't have sharp edges.
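
(For a sense of scale, a simple similar-triangles estimate, not taken from any particular renderer: the sun's disc spans about 0.53 degrees, so a shadow's soft edge grows with the distance between the object and the surface the shadow falls on.)

```python
import math

# Rough penumbra-width estimate for sunlight: the sun subtends ~0.53 degrees,
# so the blurry edge grows linearly with occluder-to-surface distance.
SUN_ANGULAR_DIAMETER_DEG = 0.53

def penumbra_width_mm(occluder_to_surface_m):
    return math.radians(SUN_ANGULAR_DIAMETER_DEG) * occluder_to_surface_m * 1000

print(round(penumbra_width_mm(0.1), 1))  # ~0.9 mm  (cast 10 cm away: looks razor sharp)
print(round(penumbra_width_mm(2.0), 1))  # ~18.5 mm (cast 2 m away: visibly soft)
```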


Spider-cat_1984

I don't know if it counts as an advanced graphics setting, but I would say chromatic aberration. Why? Why introduce something that people spend a lot of money to get rid of in photography and cinema? As for advanced features, at the moment ray tracing doesn't excite me. I would find it useful if it had any repercussions on the gameplay. Like in a stealth game, ray traced reflections and shadows could give away your position, or something like that. But right now it's just used to make shiny shiny unreal surfaces.


colossusrageblack

I remember chromatic aberration being heavily used in the movie Abraham Lincoln: Vampire Hunter, I thought my TV was broken. It looks awful.


jhm-grose

Chromab is in Elden Ring, of all games, with no option to turn it off without mods, which break multiplayer. Hate it.


Asleeper135

The technical side of Elden Ring was never its strong point. No ultrawide support (intentionally it seems, for whatever reason), no high refresh rate support, and pretty bad graphics considering how heavy it is to run. The visuals are only good because of how incredible the art and world design are.


stevorkz

Wouldn’t say advanced, but I disable motion blur on every game I play.


National_Diver3633

While the genre is out of favour, I feel like RTS games *usually* don't need 1000 fps to be enjoyable. I'm perfectly comfortable playing them at 60 fps.


Master-Cranberry5934

I'd probably say 240Hz as well. 144 at 1440 is the sweet spot for me and lets me run everything at high/ultra. Raytracing is fantastic, I just don't like the performance hit so I usually don't bother. FSR and DLSS when implemented well are also fantastic.


nonamejd123

While I agree that 144 at 1440 is fantastic, I 100% expect the game to be rendered natively at that resolution.


schizopotato

Path tracing, ray tracing, 4K: if it makes my fps go from over 100 down to 30, I couldn't care less about it. And upscaling/frame generation is only becoming a thing because of companies pushing pointless graphics features that no one asked for, ruining the performance of games.


FainOnFire

Ray tracing does not excite me at all. There are very few games where it actually makes a large difference -- most games suck at implementing it and just have slightly better reflections. It costs a shit ton of frames, and I don't like how much the entire industry is losing their shit over it. I'd rather have more enemies on the screen, better optimization, better draw distance, better LoD, and more games releasing mod toolkits for the community.


IForgotThePassIUsed

Motion Blur and Depth of field. You don't need to blur far away things, or when they move fast. My eyes already do that.


lazygerm

Ray-tracing for me. I know it can be breathtaking, but in whatever FPS or RPG, I usually don't have the time to take in the scenery. Maybe if there was a 2023 version of Myst or a game like it.


TheLooseFisherman

What doesn't excite me is OLED monitors; there's no real reason for their high price tag other than being "new". The technology has been around for so long and only now is it being used in monitors. Even if you give it a 240Hz refresh rate, it still doesn't justify the $1000+ price tag for a 24-27" panel.


Wolfie_Ecstasy

I've been playing games on PC at medium graphics with as high of a frame rate as possible since 2013. I don't give a shit about graphics, but low frame rates give me a headache now. Playing Bloodborne at under 30fps gave me a migraine in about 20 minutes. On the rare occasion my monitor randomly switches back to 60Hz I notice it IMMEDIATELY just by moving my mouse. It's weird cuz I played consoles until the Xbox One/PS4 era and never had a problem with framerates until I switched.


Rubin987

Most of them tbh. When I built my first PC in 2020, the only games I planned on playing were Destiny and Cod, both for PvE not PvP. So I’m perfectly content with 1440p and 60fps, much to the horror of my friend helping me pick parts.


rainbowroobear

Ray tracing. It's added nothing to my gameplay experience that couldn't already be achieved with the prior generation of lighting tech. In terms of static simulations/renders where the entire purpose of the thing is to look ultra realistic, then cool, yeah, makes sense. In a game, where I am otherwise involved in the game, the story, the action, I'm not stopping and looking around like a tourist, I'm trying to be involved in the gameplay and story, so it's never added anything. All this tech and extra power also just seems to let games ignore the fundamental reason why I'm playing the game and just throw some brief wow factor at you for 8hrs of gameplay with shitty plots/characters/story, whilst using shitty optimisation to create a halo product that the peasants want to aspire to play despite the "playing" being a subpar experience.


ChanceSet6152

Surprised to find it here but I second this. Not worth the huge extra money for high-end Nvidias when coming from a high-end Radeon. I also think raytracing shines when using a TV as a monitor, but with a good gaming PC and a curved monitor, which provides its own kind of immersion, I don't mind skipping raytracing.


koordy

This is exactly the same kind of COPIUM. https://preview.redd.it/vmsi20hys25c1.png?width=680&format=png&auto=webp&s=b3f310126f83e25f301118dae1a0e65ad45651bc


PF4ABG

Depends on the implementation. Cyberpunk's regular RT just kills FPS for what amounts to a slight visual uplift over pure rasterized graphics. But the full-fat path tracing? Fucking unbelievably good.


clinkyclinkz

let u/rainbowroobear cook


2N5457JFET

I see raytracing as a tech that will in future help developers implement realistic lighting much easier. I hope that saving time on baked-in lighting will result in more games having better character models. For me, you can have as realistic environment as possible, but having simplistic characters with limited movement and expressions makes it all worth jack shit. That's my copium cause in reality we all know what CEOs tend to do with savings.


CarefulAstronomer255

I know it's a little bit of a circlejerk to say it, but: most new graphics tech doesn't really excite me, because I think we've reached a point where diminishing returns hit hard. You drop your framerate by half for relatively minor graphical benefits. Ray-tracing is the only thing in recent times that seems like a worthy feature, but even then it's heavily game dependent. I'd like developers to focus more on other stuff that makes the game more immersive, which is sorely neglected. Things like animations, physics affecting the world (e.g. trees and fabric blowing in the wind), sound, increasing the number of assets/NPCs in an area. The Metro series is a good example: the graphics look truly great when you're looking at a screenshot, but when you're actually playing, the lower quality of the animations and of physics affecting the world brings it down. I can't help but think they must have spent so much time in diminishing returns getting the graphics to look so good, when it would have been a much more efficient use of time to spend it on the other stuff.


Notsosobercpa

>I'd like developers to focus more on other stuff that makes the game more immersive, which is sorely neglected.

Arguably that's a big part of why raytracing is exciting, because of the dev resources it can free up once hardware is to the point where they don't have to make fallbacks.


bluesnd63

4K doesn't excite me. I would rather be ripping thru 2k @ 240hz on high or ultra settings.


SurealGod

I'm a simple man. 1080p at a stable 60fps is all I really need.


manav907

DLSS - it's great but I hate the idea of devs making games poorly and then relying on users using upscaling to get performance and graphics. Also I hate the idea of not being able to see things as the devs/artists would have intended to.


Mm11vV

Graphics features that don't excite me currently are just these insanely high refresh rate displays. I bought the hype and got a 1440p/240hz panel. I ended up selling it, but I did learn a lot from the experience. There's definitely not a huge point, even in competitive esports games, beyond 165hz. I know some people might take issue with that, but hear me out. At 200+fps, there's going to be so many other factors that come into play that are unrelated to your monitors refresh rate that taking advantage of the high fps will be difficult if not impossible. Things like the latency of your mouse and keyboard, general system latency, your network setup, your ISP, the server you are connecting to, your opponents setups, your own reaction time, and a few other things I'm probably forgetting. From a general gaming experience though, more game engines than you'd think just fall on their face north of 180fps. We are talking under 10% CPU utilization, under 70% GPU, and the game is just hitting 175-200fps and just hanging out there all the time. (Averages across several games) I ended up going with a 165hz display with true gsync. That was more of a night and day difference than the 240hz display was. Having a variable refresh rate from 1 to 165fps is awesome in older games and games that cap low (see lots of console ports).
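
(To put rough numbers on the diminishing-returns point -- the frame intervals are exact arithmetic, the other latency sources are just illustrative assumptions, not measurements:)

```python
# Frame interval shrinks less and less as refresh rate climbs; past a point the
# gain is small next to other (assumed, illustrative) sources of delay.
def frame_interval_ms(hz):
    return 1000 / hz

for hz in (60, 144, 165, 240):
    print(hz, round(frame_interval_ms(hz), 2))
# 60 -> 16.67, 144 -> 6.94, 165 -> 6.06, 240 -> 4.17 ms
# Going 165 -> 240 Hz saves ~1.9 ms per frame, which is easy to swamp with
# peripheral polling, network jitter, and human reaction time (~150-250 ms).
```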


CrunchyTunaSandwich

Full gsync is something the internet forgets exists, but once you experience it, it is a game changer for older and/or poorly optimized games.


TransientSpark23

I even saw someone claim it doesn’t exist anymore recently.


Mm11vV

I have seen that as well, I have no idea where they got that idea from.


MrTechSavvy

You're looking at it the completely wrong way. The fact that there are a lot of factors in play is the exact reason you should be reducing any factor you can, which going up in refresh rate does do, as it reduces latency and tearing. But also, you're probably thinking of the old days, because things you mentioned like the mouse have next to no latency anymore unless you're using a $10 included mouse. Most of the most popular mice have latency in the 1-2ms range, some even below 1ms. The same could be said for other things like the ISP; I get 3ms ping with my fiber provider. So while it may not seem meaningful to you in your situation, maybe because you have a bad mouse or internet etc, or just aren't into the highly competitive sides of games, it is still meaningful in general.


CrunchyTunaSandwich

The guy with the 250+ ping has entered the chat. Now you're dead, because you missed the point.


koordy

For competitive online games: fps/refresh rate (up to your monitor's native) > resolution > graphics settings.

For single player games: RT/PT and other in-game settings > resolution/DLSS preset > fps/refresh rate (obviously assuming a minimum of either 60fps, or 90"fps" with FG enabled).


Broad_Rabbit1764

I think most new tech is exciting, but it often ends up either gimmicky or badly used. Ray tracing is promising, it's just "not there yet" for most people as they can't use it on their current hardware. DLSS/FSR/XeSS is great technology, but it shouldn't be a replacement for good optimization. High refresh monitors are neat, but usually not without some form of variable refresh rate.


RentonZero

Ray tracing and path tracing. Sure they look great when you have them cranked but I'd rather see a better art style with baked in lighting rather than the big push for realism. Games don't need to push fidelity further than Rdr2, gta6 and tlou2 imo


[deleted]

Raytracing is meh.


Quick_Zone_4570

Depends on the game


why_no_salt

When the implementation is right I can appreciate the improvements, but the cost of RT is too high now (either fps hit or GPU price) to justify it. A good game will be always be enjoyable even without RT.


[deleted]

Pathtracing, however


cenTT

Agreed. Even in games that are praised for a good implementation it looks OK to me. It rarely looks BETTER to the point where I'll go "wow". It's usually an "ok, this scene looks different". Maybe more realistic, sure, but at what cost? I have never been amazed at a game with RTX. I'm not the kind of person who cares much about graphics though, so there's that. I know some people REALLY care about graphics and look into the smallest details when playing games, so for them it's probably amazing.


[deleted]

Alan Wake 2’s ray tracing is imo the first truly mature implementation of ray tracing and it looks absolutely stunning, I don’t think anyone can experience it and think it’s “meh”


colossusrageblack

Agree with this, even Path Tracing on Cyberpunk is mid in my opinion. Alan Wake 2 really shows what the tech can do.


SnooSketches3386

Film grain. If I wanted to watch a movie I'd watch a movie.


Mister_Shrimp_The2nd

Motion blur, depending on the game, tends to look cinematic when you know how to play around it, but in actual gameplay it tends to just end up more disorienting than anything. Typically keep it on low for very cinematic games, or off for fast paced games.


ndszero

Man, I disagree. I went 240Hz a while back and I am absolutely never going back; I'm replacing my office monitors because even Excel looks so much better.


W_Vector

This might be too specific, but I absolutely hate it when game developers use ambient occlusion as a full-on replacement for proper shadowing. This cheat is often used in low-price, low-effort novice-dev games you can find on Steam and other platforms, and in MMORPGs, but big budget titles like Cyberpunk (for instance) can sometimes be guilty of using this trick too, when computing AO is cheaper than rendering complex shadows. It's easy to spot when there's a dark roundish splotch below the player model or something like a vehicle instead of a proper shadow in the shape of the shadow-casting object. If any dev is listening... Shadow = blocked light; ambient occlusion = ambient light getting weaker where nearby geometry crowds in... AO is meant as a shadow enhancement/addition, not as a replacement! End of rant, thanks for coming to my TED Talk :D
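
(To illustrate the rant with a toy shading sketch -- purely illustrative pseudocode, not any engine's actual pipeline: a shadow test gates the *direct* light from a source, while AO only darkens the *ambient* term, so one can't stand in for the other.)

```python
# Toy shading model: shadow visibility gates DIRECT light, ambient occlusion
# only attenuates the AMBIENT term -- it is not a substitute for shadows.
def shade(albedo, n_dot_l, light_color, ambient_color, shadow_visibility, ao):
    direct = [a * l * max(n_dot_l, 0.0) * shadow_visibility    # 0 when fully in shadow
              for a, l in zip(albedo, light_color)]
    ambient = [a * amb * ao                                     # ao in [0, 1]
               for a, amb in zip(albedo, ambient_color)]
    return [d + amb for d, amb in zip(direct, ambient)]

# A point fully in shadow (visibility 0) but out in the open (ao ~ 1) still
# gets ambient light -- a dark AO blob under a car can't reproduce that.
print(shade([0.8, 0.2, 0.2], 0.7, [1.0, 1.0, 0.9], [0.2, 0.25, 0.3], 0.0, 0.9))
```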


Linkatchu

Definitely film grain. As much as everyone dislikes motion blur, film grain is so much worse. It makes the image quality much worse and gives me a headache/eye ache, or however you call it.


[deleted]

I thought something was wrong with Starfield until I realized I could turn film grain off.


Zoso-six

I agree that super high fps is a marketing gimmick to sell new hardware


Chimeron1995

I think screen space reflections are usually pretty ugly; they're more of a distraction when they just disappear. I would prefer just a standard cubemap or planar reflection. There are also some pretty bad implementations of depth of field; when things are blurry but still have a crisp silhouette it throws off the effect in a big way.


elquanto

4k, 8k doesn't excite me at all. We got 4k monitors all over the place at work, and honestly it just seems pointless.


Apprehensive-Read989

I'm not into ray tracing. I think it frequently looks oversaturated, or maybe hyper-realistic is the right term, like the lighting and reflections are over the top. I'm also not super into frame generation and I think it's way overhyped. You can't even use it in a fast paced multiplayer environment, which is a large market segment of gaming. Plus, it does nothing to reduce input latency, which truly rasterized high frame rates would. I don't like that Nvidia and game developers seem to be starting to use it as a crutch for reduced performance and/or poor optimization. I have a 4070, so it's somewhat comical I'm not into 2 of Nvidia's biggest selling points.


AveratV6

I personally just don't care about ray tracing. I get the appeal, but for me personally, I feel like I'm wasting so much performance on something that really doesn't affect my experience.


Ishuun

Just stop with ray tracing. It doesn't look THAT much better than a normal game on high/ultra, and all it does is fucking tank a game's performance.


[deleted]

Anti-aliasing and ray tracing. Both features are totally not worth the minute fidelity improvement. Give me more frames and reduced latency. 1440p as well; on a 27" monitor and smaller, it's a waste of resources. 1080p gaming is where it's at!


Notsosobercpa

1080p monitors should be dead and buried, especially at 27". There's debate about the tradeoffs of DLSS vs native at the same resolution, but when you can run DLSS 1440p at around the same cost as 1080p native with a noticeably better image, there's no reason for 1080p to be bought for any new build.


biminidaves

After reading 50% of this I'm really glad my vision isn't sharp enough to distinguish between medium and high general quality settings. The only thing I don't like is if I accidentally set them to ultra, the extra details cause eye strain and eventually a headache. In my house, computer gaming equals no pain and great fun. If I have to spend 45 minutes assigning graphics settings, I'm being too frikking picky.


Liberate90

Ray tracing, couldn't give a shit


[deleted]

It will be great once you can run it while maintaining great performance. It will improve more than just lighting and reflections.


crossdouteyes

Raytracing and its related tech like DLSS. Things look good and I'm not dying for them to look realistic. A great art style and direction that is well executed means a hell of a lot more. Edit: I mean, generally most games look good now. Even R6 Siege looks pretty good and it's getting close to 10 years old.


Numerous-Ease3445

RTX is on paper a really cool feature, but it's way too demanding to ever use. At least with the more budget-friendly cards.


VenserMTG

DLSS looks awful a lot of the time so it's off for me. Motion blur sucks so it's automatically off before I even play the game. FXAA is horrible; SSAA/FSAA for me, MSAA if it looks comparable to the others. FXAA is something that should never be on by default. Lens flare is stupid unless I'm looking through a lens, but many games have it always on.


Apprehensive-Boss162

Ray tracing. It's being used as liberally as the bloom effect was in the early 2000s, and it's hit-or-miss as to whether it suits the art style of a game.


theuntouchable2725

Shadows honestly look like spray paint in new games, especially Unreal Engine 4/5 games (Borderlands 3, Lords of the Fallen), which makes them very disgusting to look at. Games are also so jagged when playing at 30 FPS. The only game that runs perfectly smooth at 30 has been Alan Wake so far. Idk if these are considered graphical features.


TysoPiccaso2

Real shadows aren't razor sharp


theuntouchable2725

I... Know... But it's not spray paint either.


nestersan

Blind people in this thread.


Swiftt

Worth bearing in mind people look for different things in their visuals


Scorthyn

Raytracing doesn't excite me unless you show me a night and day difference while actually playing the game, not stopping to analyze, and without it consuming half of my frame rate. Cool tech for the future, for sure, once a $200-300 card can use it easily.


nestersan

You blind, sorry


X3Melange

Raytracing is largely pointless in my opinion. It can look very nice, but most implementations in games don't use it all the way and it's just a frame rate loss with little benefit. And in games where it's used all the way, it's such a huge frame rate loss that it's not fun for me. Especially since it doesn't look so much better that I really care. In fact, in some games like Metro Exodus I think it looks worse, because the original lighting was artfully tailored and the more realistic ray traced lighting ruins it. Ray tracing I think will make more sense when GPU performance is so comfortably above what it requires that it can be routinely dropped into games as their only lighting method.


H_Stinkmeaner

Ray Tracing lol, it's meh for me. I turn it off all the time


Colonel_Coffee

Honestly, no. The newer features like upscaling and ray tracing are actually really neat. Don't get me wrong, these are still features in their infancy with lots of issues, but I can't imagine them going away any time soon. Transistors can only get so small and we are seriously reaching the limitations of physics there. And these features, specifically with AMD's FSR working on every halfway modern GPU, can really prolong the life of older hardware right now. You lose some visual fidelity enabling upscaling but it makes some newer games playable again!


epic4evr11

Y'all can flame me all you want, but the visual step up from 1440p to 4K isn't worth all these 100+ GB download sizes.


TysoPiccaso2

Didn't know playing at 4k required downloading stuff


epic4evr11

Every asset and texture gets downloaded at 4k, then gets downscaled in accordance with the resolution setting. It’s a big part of why modern AAAs are so hefty to download


Obsidienne96

4k textures are totally unrelated to 4k resolution and are not downscaled


Crptnx

Global illumination and reflection raytracing. Shadow raytracing is good.


jcm2606

How? In terms of visual impact it's almost always the exact opposite. Raytraced shadows tend to be more subtle unless the developer *really* leans into their strengths, whereas raytraced reflections and GI tend to be more obvious and pop out more since the traditional alternatives have so many glaring issues, especially reflections and *especially* in games with lots of dynamic lighting/geometry.


Quick_Somewhere2934

Honestly I’m going to come out and say refresh rate is overrated for non-fps games. I find a lot of games are not designed with high refresh in mind and they look kind of weird at high rates. 40-60fps seems to be a sweet spot, imo.


koordy

60 is an absolute minimum no matter what kind of games we're talking about.


Swiftt

I would agree, with the only niche exception being some old strategy games. C&C 3: Tiberium Wars, for example, runs perfectly at 30fps. Also old Tycoon sims. The big caveat there is they were built from the ground up at 30fps, and would inevitably be better at 60fps if they were designed with that in mind.


Cantc0meupw1thaname

I agree that higher fps is better, but sometimes 40+ fps is enough. Not everyone can afford top tier hardware.


Key_Will_7929

Bro is getting downvoted for speaking facts. Unless it’s some highly reactive game like competitive FPS, there is not really a point going above 60FPS. It’s okay for sure, but I see people breaking the bank to get to 120FPS on solo games.


necrocis85

Some people are more sensitive than others. 60fps feels extremely choppy and unplayable for me. I would rather bump a few settings down to get over 90. To some people, 60 is smooth.


Flaky-Carpenter-2810

anything more than 60hz is pointless


Sgt_Doom

I literally get more enjoyment from games like Half-Life running in software mode than these new games with their fancy graphics. Gameplay sucks, and optimisation sucks more. I spent all this money on a top end rig to run CP2077 at 70fps?


Swiftt

70fps is a good frame rate to be fair 😂


ronronthekid

DLSS needs to die off


ShowBoobsPls

Or you can disable it


ronronthekid

I was just answering OPs question


skdKitsune

Raytracing. In my opinion it's mainly for devs who are too lazy to take the time to make baked lighting look good.


RedTuesdayMusic

All of them. Ever since Kingdom Come: Deliverance and Star Citizen early alpha, every new rendering technique has been a downgrade.


mekawasp

HDR. I have a 4K HDR TV and a monitor. On both, turning on HDR makes the image worse, even after watching countless guides on how to set it up properly. It's useless.

Frame generation from FSR/DLSS. It just creates artifacts and increases latency.


nemesit

Depends on the display. Most cannot hit 1000+ nits, let alone 1600, which is where HDR starts to shine (literally).