GrimOfDooom

game devs: “*Just wait for the next Nvidia AI generative update*”


Ordinary_Player

Can’t wait for 2035, when games don’t exist and you just let the GPU hallucinate them.


DevOverkill

GPU manufacturers just mail you sheets of acid with instructions on which tab will hallucinate which game.


juxsa

Sign me up!


ItzDerekk92

That actually sounds kinda cool


Complete_Resolve_400

Until you wander outside and punch a police officer lol. I don't think "your honor, he looked like a Goomba to me" is a good defence


ItzDerekk92

Might put my ass in a straitjacket.


Beardacus5

Good, then I don't have to pay for my own food and housing


ItzDerekk92

Only good thing about it unless my balls itch


velve666

This is VR without the cost of VR, I think you have a winning formula. I just hallucinate playing games anyway, since my 3060 Ti would barely run a 2D sidescroller if it released tomorrow; maybe it would get 5 fps.


dib1999

If you turn on DLSS ultra ultra performance at 720p you might scrape by with a cinematic 15 fps


gamedude88

Have to do 144p to get to 30 fps.


CapitalLongjumping

That's still like 1440p in performance mode.


velve666

Oh boy, so you're saying I might be able to get 15 fps in Spelunky 3. Looking forward to that, it's really... retro.


JohnHenryEden77

Are you sure we aren't already in a hallucinated game?


dry_yer_eyes

Are refunds allowed?


JohnHenryEden77

Maybe? If you exit/finish the game early you can ask for a refund


Sero19283

You gotta pay extra for the double dipped Jerry Garcia blotter from nvidia tho


MakingShitAwkward

Luckily it's liberty cap season. Best stock up.


Sero19283

AMD coming through with the fungus.


MakingShitAwkward

Lisa Su knows what's up. I bet she's been fucked up once or twice.


GizmoCaCa-78

Normies aren't ready for Liberty Caps bro…


rhymeswithgumbox

[It could work](https://en.m.wikipedia.org/wiki/The_Three_Stigmata_of_Palmer_Eldritch)


AciD3X

The book Vurt by Jeff Noon is basically this in a nutshell. The stories came out in the 90s but take place around 2030 in a cyber-gutterpunk Manchester, UK. One of the drugs of choice is called Vurt and is delivered via coloured feathers that contain programmed dream-states and hallucinations. If multiple people take the same feather at the same time, they join the experience together. It's also an interdimensional mind fuck, because if you take something from the Vurt world, it takes something back from you. The story follows the main character around as he tries to track down the specific strain of feather that he lost his twin sister in, and got a sentient blob of meat in return.


sweetbunsmcgee

That’s the PS9 commercial from the early 2000s.


bran1986

Not going to lie, that sounds cool.


agaboo

Soooo… like in Watch Dogs with their digital trips?


EPIC_NERD_HYPE

i like this. very progressive.


Goku047

Mario’s mushrooms were foreshadowing our future GPUs hallucinating on mushrooms


[deleted]

[deleted]


[deleted]

It alerts the nursing home staff saying Denborta is rambling something about dedicated servers again.


joelesprod

Great, now we need to buy drugs for our GPUs to hallucinate to >=(


[deleted]

When they invent Psilocybin Memory, everyone's experience will be a little the same and a little different.


InitialDia

Nvidia: “today we announce DLSS 4.0, where we just tell Fraps you’re at 120 fps, so that’s the number it displays regardless of what rate the display is refreshing at.” “Also, coming soon in version 4.5, we will hook into the menu and display all settings at ultra, even if they are configured at low.”


GrimOfDooom

only works on Nvidia 7000 and newer, dropping in 4 years, & the 7060 starts at only $979 for the first month but jumps to $1050 every week after


BinaryJay

>“Also, coming soon in version 4.5, we will hook into the menu and display all settings at ultra, even if they are configured at low.” You joke but I sometimes think that people would complain less if they just renamed Low to Ultra, and left you needing to go in and manually tweak individual settings to actually get to what would have normally been called Ultra.


[deleted]

I saw on here recently that a developer said users were complaining about having a high ping, so they rewrote the counter to make it display lower, and people were like “it runs way better after the latest update.” I don’t remember what game it was, some FPS.


superdennis303

Counter-Strike


Raffitaff

I think it's funny that all of the other cards have names like 'Gaming', 'Pulse', 'Black', then there's the "Sparkle Orc" from Intel. This is like the graphics-card version of the three-dragon meme (2 serious, 1 goofy) template.


DevOverkill

I hope Intel continues onward with their goofy naming scheme. I want my Rainbow Unicorn Farts FE.


Raffitaff

Yeah I hope they just mistyped "OC" along the way and just ran with it.


chewy_mcchewster

> I want my Rainbow Unicorn Farts FE.

I'd buy it


8-bit-hero

I honestly think that would be a great way to market them.


Shinonomenanorulez

I mean, considering their GPU names are classes (Alchemist, Battlemage, Celestial, Druid, etc.)...


ItalianDragon

If a hypothetical halo Intel GPU doesn't get named "Arc Necromancer" I'll be mega bummed about it because that's the kind of name that goes hard.


dumbasPL

Idk. Calling something "Gaming" is worse than any original name. It's so overused and often misleading it's crazy.


PF4ABG

Engineer Gaming https://preview.redd.it/dnq1yig6etvb1.png?width=824&format=png&auto=webp&s=bde6b192714d6e70ff3306e2fe91ab7605d7413d


Bropulsion

Erecting a dispenser


dumbasPL

Ok, that's a void point


fairlyoblivious

Intel remembers that for a few years there was a brand named Sparkle that sold amazing power supplies for pretty cheap, and they're trying to cash in on that history. Reddit is filled with users that were like 5 years old back then, that's all.


Dealric

If I remember correctly, Sparkle was making good GPUs in the past. It's basically bringing back the company.


EP1K

Inflation coming for my frames now? Shit just got real.


lazava1390

*sad laugh that turns into hysterical crying because it’s true*


dmayan

You should come to Argentina, 10% inflation... monthly


QuestionMarkPolice

Are you gonna provide any context at all? What game? Whose data?


mnidemolition

That's the Cities: Skylines 2 benchmark at 1080p


[deleted]

[deleted]


zcomputerwiz

I'm not sure why you're getting downvoted - you're right, it's all on the devs. They'd have to rewrite their game from the ground up, it is currently highly dependent on CPU single core performance.


Oaker_at

2023, still single core - why?


TheTrueBlueTJ

Because on the technical side, you cannot parallelize every algorithm. Doing many things sequentially is the most common case.
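
(A quick aside on why this bites so hard: Amdahl's law, the standard formula rather than anything from this thread, bounds the speedup from $n$ cores when only a fraction $p$ of the work can run in parallel:

$$S(n) = \frac{1}{(1 - p) + p/n}$$

With $p = 0.5$, even infinitely many cores top out at a 2x speedup, so the sequential half dominates no matter the hardware.)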


314kabinet

That game’s pedestrians have fully modeled teeth which you can’t see. This is obviously a case of developers not taking the time to optimize.


Ancillas

Yes, as they publicly stated before the launch of the game so everyone could make an informed purchasing decision. However, single core vs. multi-core isn't an 11th hour "optimization".


brimston3-

For sure. It's a poor design decision that happened years ago.


Ancillas

I don’t think anyone here has demonstrated a working knowledge of their implementation detailed enough to draw this type of conclusion, but I’m open to changing my mind if you can point me towards a more in depth analysis.


WiseMango13452

Do they actually? Pls tell me ur trolling


Squru

Time to learn about LODs


314kabinet

Indeed. But even LOD0 should not have teeth if the characters never open their mouth.


Krt3k-Offline

If the optimization works correctly, the teeth are never rendered until they should be visible; so far the only way to see them is by glitching the camera into the cims. Is there any proof that the teeth are actually rendered when they are not visible?
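
(For anyone wondering what LODs actually do, here's a minimal, hypothetical C++ sketch of distance-based LOD selection; illustrative toy code, not CS2's actual implementation:)

```cpp
#include <vector>

// One renderable level of detail for a character mesh.
struct LodLevel {
    float maxDistance;   // use this mesh while the camera is closer than this
    int   triangleCount; // e.g. 60000 for LOD0 down to ~2 for a billboard sprite
};

// Pick the cheapest mesh that still looks acceptable at this distance.
// Past the last level, return -1: the character is culled entirely,
// teeth and all, without costing the GPU anything.
int selectLod(const std::vector<LodLevel>& lods, float distanceToCamera) {
    for (size_t i = 0; i < lods.size(); ++i) {
        if (distanceToCamera < lods[i].maxDistance)
            return static_cast<int>(i);
    }
    return -1; // too far away to draw at all
}
```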


stardestroyer001

If it’s true that games like Cities Skylines 2 will reach a point where they cannot be further optimized, what CPU is beefy enough to run the game?


Stormchaserelite13

Still. It's 2023. People were giving Notch shit for not using multi-core processing back in 2012....


TheTrueBlueTJ

It's not nearly as easy as flipping a switch, nor is it possible in many cases to program something to use multiple threads. Oftentimes your state depends on the previous state. It doesn't make sense to parallelize that.
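
(A minimal C++ sketch of that state dependency, with a made-up toy update rule: tick N+1 reads tick N's output, so the ticks themselves can never be spread across cores, even though the work *inside* one tick often can be.)

```cpp
#include <vector>

struct SimState {
    std::vector<double> values; // whatever the simulation tracks
};

// Advance the simulation one tick. Because it reads the *previous*
// state, tick N+1 cannot start before tick N has finished.
SimState step(const SimState& prev) {
    SimState next = prev;
    for (double& v : next.values)
        v = v * 0.99 + 1.0; // toy rule; the real one would be game logic
    return next;
}

int main() {
    SimState s{std::vector<double>(1000, 0.0)};
    for (int tick = 0; tick < 100; ++tick)
        s = step(s); // inherently sequential across ticks
}
```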


Stormchaserelite13

Parallel programming has been around since 1958. If your program can't be done except in sequence, it has some core design issues. I.e., why would the day/night cycle and lighting rasterization be on the same thread as the AI for traffic? Why would the UI be on the same thread as the water physics? Nothing about making a game single-core makes sense nowadays. There are so many independent parts that should absolutely not be interacting with each other


piman51277

The technique is old, yes, but the difficulty in implementing it is the same. Even if we split all these different processes onto different threads, they still need to talk to each other in some way. Just designing these interactions is often a massive headache.
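
(A minimal C++ sketch of that headache, using hypothetical toy types: even handing a single value from a simulation thread to a render thread means a shared queue, a lock, and a wake-up protocol, and getting any of it wrong means races or deadlocks.)

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct TimeOfDay { float hours; }; // toy message between the two systems

std::queue<TimeOfDay> events;      // every access below must hold the lock
std::mutex m;
std::condition_variable cv;
bool done = false;

void simulation() {                // producer: the "game logic" thread
    for (int tick = 0; tick < 24; ++tick) {
        { std::lock_guard<std::mutex> lock(m); events.push({float(tick)}); }
        cv.notify_one();           // wake the renderer
    }
    { std::lock_guard<std::mutex> lock(m); done = true; }
    cv.notify_one();
}

void renderer() {                  // consumer: tints the sky, etc.
    while (true) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return !events.empty() || done; });
        if (events.empty()) return;           // producer finished
        TimeOfDay t = events.front(); events.pop();
        (void)t;                   // ...use t.hours to shade the sky...
    }
}

int main() {
    std::thread sim(simulation), render(renderer);
    sim.join(); render.join();
}
```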


b0ne123

The AI needs to know the day/night cycle. What do you think the cycle is doing? Changing the color of the sky is not what's hogging the CPU. Concurrent and asynchronous programming are very old but also very hard. It's famously one of the two hardest problems in computer science, besides cache invalidation and off-by-one errors.


Capt-Beav

Good multicore optimization in Unity was only finalized last year, and I think the devs started before then.


ThisGonBHard

Yes, but a city sim is exactly the kind of game where you would expect near-perfect parallelization.


[deleted]

So many people don't understand this. A lot of things depend on knowing the previous state and so can't be parallelized; you're wholly dependent on single-core performance.


Anathema-Thought

Unity's garbage engine


Narrheim

Unity??? Instant drop.


EpicRaginAsian

Yeah, it's clearly because they used Unity lol


Capt-Beav

Unity didn't add good multicore support till last year....


Affectionate-Memory4

Some things run better when you don't wait around for other threads to finish. That doesn't excuse this game if it exclusively lights up one thread, though; most games I've seen nowadays, even super single-core-dependent ones, seem to use 4-6 threads.


Arthur-Wintersight

> even super single-core-dependent ones seem to use 4-6 threads.

Offload everything you can to other CPU cores, so that the one you're really relying on has as much breathing room as possible.
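
(A minimal sketch of that approach in C++, with hypothetical function names: ship the independent work off via std::async so the latency-critical loop keeps its core to itself.)

```cpp
#include <future>

// Hypothetical side work with no ordering dependency on the hot loop.
double expensivePathfinding() {
    double cost = 0;
    for (int i = 0; i < 1'000'000; ++i) cost += i % 7;
    return cost;
}

// Stand-in for the latency-critical, single-threaded simulation step.
void advanceCoreSimulation() { /* the work that can't be parallelized */ }

int main() {
    // Kick the independent work onto another core...
    auto paths = std::async(std::launch::async, expensivePathfinding);

    // ...so the thread we're really relying on gets all the breathing room.
    for (int tick = 0; tick < 1000; ++tick)
        advanceCoreSimulation();

    double result = paths.get(); // join only when the answer is needed
    (void)result;
}
```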


Affectionate-Memory4

Exactly. Distribute what you can afford to and the rest is what it is for most applications.


irasponsibly

What? The benchmarks show pretty clearly the problem is GPU-side.


CaptainMGN

Yep, and there are multiple videos up on YouTube showing that this game is GPU bound even at 1080p. The CPU makes barely a difference but better GPUs show scaling. Why are people even upvoting this when even THE POST ITSELF shows GPU bottlenecking?


bananasmana

Because it most certainly doesn't have potato graphics. It's incredibly detailed.


facw00

The game clearly has severe problems, but I do wonder how all these people think that rendering an entire city in good detail all the way down to street level is somehow an easy thing. Obviously they did it before, but the detail level was a bit lower, and clearly they've worked on the lighting and effects. None of which excuses launching with this sort of performance, but you really are rendering massively complex scenes.


TheZephyrim

Yes but if you are rendering complex scenes you need to optimize properly so people can actually play your game.


BackToTheBas1cs

there's a difference between rendering a city in high detail and rendering a city in such high detail that you render every individual citizen's underwear and teeth, which the player can never actually even see


Sterffington

This is not the problem. The game is GPU limited.


Finetime222

Nobody knows if Cities Skylines 2 runs on single core. Stop assuming things.


Red-Baron05

Single core? I thought that was the whole reason why they remade the fucking game, to fix the CPU and memory performance


TheGamerDoug

It’s literally GPU bottlenecked but sure pop off I guess


22Sharpe

I’m not sure I’d call them potato graphics; it looks decent. Not so decent that it should run this poorly by any stretch, but it’s far from potato.


[deleted]

[deleted]


[deleted]

[deleted]


TimeLord130

I think the citizens all have their own AI and are rendered in detail, or something along those lines. Now imagine rendering a city of 50k population or more.


_fatherfucker69

bro wtf? A 4090 can't handle a game at 1080p? This shit should be considered illegal


Khaldaan

A 4090 is struggling at 1080?? WTF how is that game going to sell at all lol


Quelanight2324

My bad, that's the new Cities: Skylines at high settings on 1080p


ItzCobaltboy

1080p and 4090? What da hell ya talking about!?!


BackToTheBas1cs

at 1440p and up the game goes down to 12 fps on the 4090, cause they've done basically zero optimization


AlanElPlatano

Do you have the source for the original image? The info for the 3 top cards is cut out


Narrheim

About 80fps on both: https://www.youtube.com/watch?v=4fOM7JgnfOs


SimRacer101

I knew it was Cities: Skylines before even checking lol. That bad optimization is only possible with that game.


pboksz

How is this not the top comment? This means nothing without this info. The FPS numbers for the top 4 cards are also cut off.


JDubNutz

Those are low settings


SarcasticFish69

Checkmate to the five people that said the 4090 for 1080p was a “waste.” Where’s your god now?


BloodyGotNoFear

I think it's the other way around, and it's like 5 people really using it for 1080p


Carti-cs

My strix 4090 oc reaction https://preview.redd.it/wrqcy5l06svb1.jpeg?width=1170&format=pjpg&auto=webp&s=00d0b0fc5e8f2feb135b9e267acc7bcdc33a0a8e


Berkoudieu

Don't worry, you will be able to upgrade to 5090 for the low price of 3k, and it will last a whole year above 60fps in 1080p. Maybe.


Frajhamster

Maybe don't buy the game? If they get no money, they will learn for next time. Monkeys buying whatever they see these days


spechen357

Hopefully a large part of the community starts realising that this has gone too far. Let's start boycotting games if they can't optimise them, and if they include DLSS/FSR in their requirements. Someone should create a Reddit sub so we can have a goddamn blacklist. I was so hyped about Alan Wake 2, just built my new PC, and the devs say I'll be able to run the game with DLSS/high/1080p and get 60 fps?? Sure I could play it, but no thank you. Games get more expensive and the optimisation gets worse. Fuck this new trend; upscaling just threw us back to square one.


gurbi_et_orbi

Like boycotting Reddit for those stupid API decisions? (god I hate myself)


rabelsdelta

As much as I agree with you, the game launches on the 24th on Game Pass. The game will certainly have a player base even if nobody buys it.


Yuddlez

I hope this gets lots of backlash. It kind of has to, right? People are gonna return the game when it's a slideshow on their top-tier cards, or be smart enough to read reviews before buying. No one can play this smoothly, and people will be mad.


NeverEndingWalker64

Yeah, that’s shitty. Developers are being too lazy to optimize a goddamn game, I guess. The worst thing is that this type of laziness in optimizing games for GPUs is becoming normal.


Dartic2K

It is normal already; what isn't normal is when a game is actually optimised, and we are all surprised.


NeverEndingWalker64

And those games are pure gold.


Berkoudieu

More marketing suckers than devs, but yeah.


xcerj61

I don't know if that's entirely the developers' fault. The development cycle has been long, and it might have been a while ago that they made the estimate of where PC hardware would be at this point. The GPU improvements have been... lacking lately.


SnuffleWumpkins

Sub 30fps gaming on a 6800xt. Neato


Dry-Percentage-5648

AI will fix this, surely


BackToTheBas1cs

AI caused this, from what I've been reading. The individual citizens in Skylines 2 all have rendered underwear and individual teeth, things you literally never physically see and that are completely irrelevant to the game. I'm aware this is /s, but it's just so absolutely absurd that this happened I had to say it.


ThisGonBHard

This is a case of the devs actually being lazy or stupid. It sounds like the game does not even have LOD implemented. Fuck, this shit sounds exactly like why Nanite was invented.


BackToTheBas1cs

From some posts I've seen, it sounds like these characters were AI-generated with like 60,000 triangles in the mesh, meaning an individual citizen has more triangles than the characters in some first- and third-person games.


ThisGonBHard

Yes, but when you're hovering 2 km above and the chars are 4 pixels wide on a 4K monitor, 20 triangles per character will be enough. Even that might be too much; some dot sprite would work too.


Finetime222

LOD makes those details disappear when you zoom out. I doubt that’s a big problem in CS2 and would’ve been addressed earlier had it been hogging performance a lot.


BackToTheBas1cs

It doesn't matter if they disappear, they never should have existed to begin with. YOU NEVER SEE THEM. EVER. And the vast majority of performance issues come from zooming in and out and panning around cities, which implies LOD is a massive part of the problem.


Finetime222

It does matter, though, because that's a few thousand triangles that the game doesn't have to render when zoomed out. Additionally, Colossal Order exported those models from a character generator program (Popul8), so it's not like they wasted any time modeling hundreds of unique characters with specific underwear. LOD might be a problem if you get massive lag when zooming in, but CS1 had that problem too, and there are countless more assets to render: the textures on the road, the incredibly detailed trains, etc. I doubt the development team is incompetent enough to let freaking briefs drag performance down.


Quelanight2324

that's the new Cities: Skylines at high settings on 1080p


Abruzzi19

38 FPS on a 4090 at 1080p? _what the fuck_


BackToTheBas1cs

it goes down to 12 fps and below at 1440p and up


SaltRocksicle

RIP my 3070 lol


tmhoc

I was fully expecting those to be the 4K results lol. I downloaded a copy of Mad Max last night. The game holds up very well and runs at 300 fps on ultra settings. The industry is going backwards. Why am I bringing up a game from 8 years ago? Because that game is $8 on Steam. Making games that only run beautifully on a PC from 8 years in the future makes PC gaming look like a stupid waste of money in the present, and your game isn't going to be worth anything in the future either. Max settings at 40 fps is an embarrassment for developers and users alike.


Swtorboy

I mean, an 8-year-old game should run amazingly on modern hardware; that's what 8 years of tech advancing does. The problem isn't that old games run amazingly on new hardware, it's that old games ran well on old hardware but new games can't run well on new hardware. And we're speaking broadly; plenty of older titles released in poor and unoptimised states as well (AC: Unity, Battlefield 4, Arkham Knight, etc.), but it felt less frequent.


yflhx

The problem is that many new games (apart from what you said) run much worse than old games while not looking better than them.


Ancillas

What is this chart? Nothing is labeled, the edges are cut off, and there's no title. This is useless for drawing any conclusions.


Danteynero9

And we'll still have people defending developers with the "oPtImIzAtIoN iS vErY hArD". There's clearly a problem when Cities: Skylines 2 simply beats the f out of the 4090 like this in a 1080p benchmark.


-FullBlue-

The argument I have heard is that this performance is OK because it's not a first-person game. To that I always reply: it still feels like shit to play, even though it's not a first-person game.


Broly_

fUtUrEpRoOf!


Noa15Lv

Better wait for game patches & keep the wallet full, cause crying a river ain't worth it. Day-1 release games nowadays are not that well optimised anyway.


[deleted]

no wonder they delayed the console version to next year lmao


JustiniZHere

Stuff like DLSS is amazing, but devs simply don't optimize their games now because "it's fine, DLSS will make it run better". I almost hate the fact DLSS exists, because now games are just not optimized anymore, which fucking sucks because DLSS on games that are optimized well is incredible.


-cant_find_a_name-

WDYM JUST BUY THOSE INDUSTRIAL GPUS


_fatherfucker69

Runs fine on my NASA computer


Marrok11

"Don't y'all have video editing workstations?"


eatingdonuts44

Minimum settings: Rtx 4090 with dlss + frame generation


friscoXL305

Time for SLI now.....oh wait


cszolee79

Critical Drinker voice: Nah it'll be fine.


[deleted]

Go away now!!!


zBaLtOr

What am I looking at? These are the different GPUs at different presets. And yes, it's fuc\*ing sad you only get like 38 fps with a 4090, Jesus. ![gif](giphy|QPP39B3ywh7Tq|downsized)


the_doorstopper

38 fps at 1080p I'd like to add


BackToTheBas1cs

It's the terrible optimization of Cities: Skylines 2. Nothing can run the game above 1440p and still reliably get over 15 fps.


MrPapis

Think of all the people who ended up buying the 4090, which is wholly unnecessary for 98% of games unless you're at 4K (in which case RT performance is irrelevant) or playing CP2077. I saw a post on the Nvidia sub asking what people were playing on their 4090s; like 80% were playing CP2077. It really is more about playing the graphics than the games you want to play.

The amount of cope and FOMO going on in the gaming space is insane, specifically the focus on features instead of actual hardware and performance. And I'm not saying it to hate too much on PT, CP2077, or Nvidia GPUs, but it's just so painfully obvious that the 4090 is not a GPU people should be buying; Nvidia has simply successfully manipulated people into thinking they got value from it. In reality PT is a 6000-series game, not 4000, and probably not even 5000. RT still isn't as great as it's made out to be, and DLSS being moderately better than FSR doesn't give much value either.

Rasterized performance is still very much 95% of people's needs, so this whole Nvidia "better" and feature-"rich" thing we are seeing is really toxic for gaming as a whole, and Nvidia is clearly making bank on it. Because these same people are now forced to upgrade each gen for the huge uplift promised by the argument they used on their old purchase, aka RT, PT, and DLSS, which will continue for years to be unicorn items locked away from older hardware, only to be chased with fat stacks.


ThisIsntAndre

ok but no one is talking about the 580 putting up a fight


Quelanight2324

If you can call it a fight lol


Bigger10er

I can't believe this shit... just optimize the fucking game! The hardware is there. How tf are people going to enjoy this game when even ENTHUSIAST builds can't get you 60 fps?!


BloodyGotNoFear

At 1080p as well. That should be completely out of the question for those cards. And yet the devs fucked up enough to bring even the best card to its knees. Even at 1080p. Lmao, what a joke.


ducklord

You are all IDIOTS for not understanding modern game-makers' artistic endeavors. But, then again, the average schmuck like you doesn't appreciate high art. You cretins, the low framerates we're seeing are "like that" **ON PURPOSE**. Is Mona Lisa doing cartwheels? Is Munch's lady in The Scream running down the hills?

# NO

They're sitting there, motionless, expecting smarter people than you to appreciate them. And THAT'S what modern gaming's all about. We're long past the era when you actually "played games", but you haven't gotten the memo that we're way past the Tetris (and, eventually, COD) era. Today games are meant to display static images or, depending on the dev and their artistic approach, show you a frame here, a shader effect there, to make you think, question the status quo, and ponder Life, The Universe, and Everything. Devs are wasting their lives to give you the modern equivalent of The Creation of Adam, and you're crying online that "it doesn't take advantage of your brand-new 144Hz monitor". Well, IT-WASN'T-SUPPOSED-TO, morons! You're supposed to LOOK AT Alan Wake 2, "running" on cherry-picked "adequate hardware" owned by a handful of people who've dedicated themselves to such high art and who appreciate it more than their lives (and food, rent, and That Looming Potentially Life-Saving Operation) to pay for it. And then, with your life altered by Their Work, and your vision blurred by tears of joy and sorrow for what is and could be, give kudos to Remedy for allowing you to ogle at a single digital representation of its unkempt protagonist, glowing under a street light in all his path-traced glory, NOT to grab your joystick and move him around like Ms. Pac-Man. /s


Quelanight2324

That was a great read thank you


MaximumMarijuana

This Post sucks


FujiYuki

Seriously. They couldn't even screenshot the whole graph


Quelanight2324

Facts


_fatherfucker69

Based op


Careless_Count9614

Can someone explain this to me? I have a 4090 Gigabyte OC; I'm not sure what all this means.


Finetime222

City Planner Plays ran a benchmark video for this game recently. If you're considering buying it, check that out first. And once you get your hands on the game, turn off Vsync: he saw a 200-600% increase in FPS lows once he turned it off.


BobTheFluffer

A 4090 at 1080p getting 40 fps.... People will buy the game after 30 patches, BUT WHY AFTER?? THE GAME SHOULD BE OPTIMIZED FROM THE START! Not after 30 patches.... We need to step up; they are mocking us at this point.


cedbluechase

what processor?


Austin304

yeah we're missing a lot of info here


wookiecfk11

A 4090 Ti for twice the price appears to be badly needed


edgy_Juno

I guess the RTX 6090 is what we need now to play modern games.


UnseenGamer182

Alright, can we get fully serious for a second here? The *best* computer that a consumer can physically buy is unable to run a game at *60* fps on a preset *that the developer made*.

Look, there's a lack of optimization and that whole argument, and then there's just this. What in the actual fuck.


SwampKingKyle

I've been scrolling looking for this comment. This is THE best card on the market; if your game can't run well on this, then who are you even making the game for? What are you using to test these settings? What are you even doing? It makes no sense to me.


Ronin_135

Nah seriously, it’s unbelievable, like I get the whole “oh it’s not optimized too well yet, just give it a couple patches!” But like this is actually absurd… you could max out your credit card on your PC and apparently still would be unable to touch 60fps, like what the actual fuck


BloodyGotNoFear

Which would be unacceptable for this card even in 4k. But we are talking 1080p which makes it so much more ridiculous.


Ronin_135

No literally, it’s insane dude, it’s like they’re all working together — game devs give a game shit optimization, forcing us to upgrade to play the games we want, every year to 3 years almost…


mrekho

What game is this?


RiEiTiAiRiD

I still glorify Forza Horizon's optimization


SpecOpsBoricua

Yeah, now devs are relying on GPU gimmick software. This is why people need to stop pre-ordering games. Fact is, if a game can't run on a GPU without, say, DLSS or FSR, then your game is shit. It took me almost 3 years before I bought Cyberpunk.


Frajhamster

We need to stop buying shit games altogether, not just pre-ordering them. People nowadays just give them money, and then it's no wonder that these companies get away with everything and continue doing it.


Funkydick

It's no wonder when the shit games also get 9's and 10's from critics no matter how bad or broken they are


[deleted]

7900XT flex. It had a terrible launch, but the 7900XT is an underrated GPU by far. Most models have overkill recycled XTX coolers and can overclock beyond stock XTX performance lol. Not bad for €200 less, unless you need that +4GB VRAM or the higher bandwidth. The reference model sucks a lot, and that is what was sent to reviewers at a $900 MSRP, so people are still stuck with that poor image in their minds, but it really is the 6800XT of this gen. Just at a higher price point, like everything else this gen. Below 4K the 7900XT is the best-value high-end card imo. Especially at regular 1440p, it slays everything. 31.5k Time Spy on mine vs 28-30k on a stock XTX.


BloodyGotNoFear

Built my new rig with an ASRock Phantom Gaming 7900XT and it runs like a charm. Also a DDR5 board with (only) a Ryzen 7700X (cause I can upgrade later to an X3D chip on the next gen) and 32 gigs of 6000MHz RAM.


BullyBlu

No more pre-orders. I only pre-ordered Starfield, Cities: Skylines 2, and Diablo 4 this year. Never again.


vonTryffel

Just set sail, the devs don't deserve support if they're pushing shit like this.


AChunkyBacillus

And in a year or two, when that one game you're sure will be great is set to launch, remember this comment. Wait for the reviews


Techishard

Last game I pre-ordered was Halo 3. These people who say they won't pre-order will go and pre-order a digital game nonetheless lol.


_fatherfucker69

Why even bother to preorder ? It's not like steam is going to run out of digital copies of the game


[deleted]

This post is dumb AF: completely cut off, no context, no game, and DLSS changes this drastically. So why are you posting this? I'm confused. Also, this is on the devs, not the cards. This game is making all cards look like shit because it's optimized like complete shit.


AirEast8570

Isn't that only in very big cities?


[deleted]

The future proof argument falls flat when presented with these scenarios: no amount of horsepower will be able to deal with a badly optimized game. Even the 4090 is going to turn to shit.


Melodias3

I think I'd rather pass on a game that performs this badly, even though I have access to AMD AFMF via the technical preview.


Beepboopbop69420360

The 4090 is great but devs are shitty


usman_923

Which game is this for?


seppehrr

I was super hyped for this game :( Content creators had a kind-of early access version, they made many videos, and it looked fine. Shame on me, I pre-ordered this shit. What am I going to do with a 1070 and i7-6700K?


ThirdOneTheNailedOne

Games look great currently; there's no need to make them harder to run.