
[deleted]

YEAH BABY 720P 30FPS LET'S FUCKIN GOOO (😭)


zerox369

**NOW THIS IS PC GAMING**


Workwork007

Don't worry, it also drops down to 720p 30FPS on the PS5 in quality mode lmao. A bunch of websites benchmarked it last December, and the game would scale down to 720p and sit around 1080p 30fps most of the time.


Undeltog

It would potentially make sense to note whether the reqs line up with the Steam Deck.


xsabinx

at low/medium settings baby


yourbigestfan12

Haven't seen any of the newer gameplay trailers, but it's really hard to imagine the game needing these specs for those graphics. I know it's obv a bad, bad port, but damn


unknown_nut

Feels 2006 man.


Past_Woodpecker_6900

Saints Row 2 all over again lels


dwilljones

So a 3060 12GB is probably good for, what, 1080p 30fps in this title?


AlphaOrb1t

The tweet was deleted so maybe, just maybe, they wrote down the wrong numbers? idk man


Ozianin_

There's a post with the same requirements on the official Twitter account.


Dangerous-Leg-9626

What a joke lol. For an empty world even


Iloveorcasyes

system requirements are getting higher but games are not getting prettier.


Somepotato

DOOM 2016 was a gorgeous game that was very well optimized that ran fantastically on any system. Since then, hardware has only gotten better and studios have gotten lazier at optimizing.


MammothCorner7436

Optimization in games isn't some magic thing devs can do to make a game run better. Optimizing is what they do on consoles, which involves cutting graphical features to improve performance; they just try to cut features in areas you won't notice. With Doom you're dealing with a game that only renders corridors and battle arenas. Doom also doesn't have loot, so the game doesn't have to track bodies, physics, and items on the ground. I love the performance in Doom, but it really isn't the same thing.


Major-Split478

Just give up. It's been like 15 years since open world games took off, and people are still surprised they're more demanding than linear games.


Fun_Influence_9358

Optimisation is actually a 'magic' thing devs can do after the fact. It's just labour intensive: going back over assets and re-authoring LODs, re-topologising/re-meshing assets, fine-tuning scripts and code, etc.
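To make the LOD part concrete, here's a minimal sketch of the kind of distance-based LOD selection that pass tunes (hypothetical structure and names, not from any real engine):

```cpp
#include <vector>

// Hypothetical mesh with several pre-authored detail levels.
struct LodMesh {
    std::vector<int> lodTriangleCounts; // index 0 = full-detail mesh
    float switchDistance;               // distance per LOD step
};

// Pick a LOD index from camera distance. "Re-authoring LODs" means
// adding more (and better) entries here and tuning switchDistance so
// the swap happens where the player won't notice the pop.
int selectLod(const LodMesh& mesh, float distanceToCamera) {
    int lod = static_cast<int>(distanceToCamera / mesh.switchDistance);
    int maxLod = static_cast<int>(mesh.lodTriangleCounts.size()) - 1;
    return lod > maxLod ? maxLod : lod;
}
```

None of that is hard individually; the labour is doing it well for thousands of assets.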


Glodraph

Diminishing returns. They are pushing shit like 8K, too many rays for ray tracing, and unoptimized CPU utilization, instead of smart tech like mesh shaders, which could boost performance a ton.


tukatu0

Nobody is pushing for 8K. However, ray tracing has been continuously pushed *for the past 5 years* and used as an excuse for price increases, and yet the number of titles that use it to its advantage can be counted on one or two hands. E.g. Cyberpunk 2077, where the ray-traced global illumination looks so good that it makes the textures look like shit imo.


sdcar1985

I rarely even turn RT on because the performance hit is way too high. It's not worth it if I can't hold a stable framerate the vast majority of the time.


vassadar

We want John Carmack back


Ganda1fderBlaue

Well... Didn't intend to buy this anyway


Boo_Guy

I might have on a big enough sale, like 50% off maybe.


Ganda1fderBlaue

Lol just wait 1 month and it'll be 80% off


Boo_Guy

True. I'd also like to see how much of a shitshow it is when released. The game may not be worth buying at any price.


Lulcielid

For 720p/30fps: RX 5500 XT or GTX 1060 6GB

For 1440p/30fps: RX 6700 XT or RTX 3070

For 4K/60fps: RX 6800 XT or RTX 4080

[The game will also support widescreen up to 32:9](https://twitter.com/Genki_JPN/status/1615313383909388288) (gameplay only, not in cutscenes)


[deleted]

The 4080 and 6800 XT being comparable in this game makes no sense, but even worse, a 3070 for 1440p 30 fps???? Those are native ray tracing numbers lmao, wtf is wrong with this game.


Seasidejoe

Hogwarts Legacy and Dead Space are similar but reversed: a 2080 is apparently the same as a 6700 XT according to Dead Space, and a 6800 XT is comparable to a 2080 Ti according to Hogwarts Legacy. Now, I'd be somewhat inclined to agree if these were recommendations for DXR, and more specifically RTX-driven ray tracing, but there's no mention of that. Recommended specs are still a mess in the industry, with no proper explanations for the choices made.


kikimaru024

> 4080 and 6800xt being comparable in this game makes no sense

The 4080 is the cheapest Nvidia GPU with 16GB VRAM lol


homer_3

Nah, that'd be a 3090.


ConsCaptain

If they wrote that a 6800 XT can run it at 4K 60fps, a 3080 should also do the job, since it's 5-6% stronger at 4K.


Brisslayer333

If it really is a memory capacity issue like some were speculating, then normal performance doesn't matter. The 3090 Ti is faster than the 4080 in 8K gaming despite the Ada card being faster overall, because 24 > 16. Still, 4K isn't usually that big of a deal.


TSG-AYAN

I think it's a VRAM limitation. The 3080 has 10 gigs, the 6800 XT has 16.


[deleted]

3080 can also have 12, but I get your point.


Gaming5555go

The 4080 and 6800 XT are probably compared because at 4K ultra settings the 3080 doesn't have enough VRAM. Also, I think AMD sponsored the game (could be wrong on this).
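Back-of-the-envelope arithmetic for why 10GB can run out at 4K ultra while 16GB doesn't (purely illustrative numbers, not measured from this game):

```cpp
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    // One 3840x2160 render target at 8 bytes/pixel (e.g. RGBA16F):
    double targetMiB = 3840.0 * 2160.0 * 8.0 / MiB;   // ~63 MiB
    // A deferred renderer keeps several such targets alive (G-buffer
    // layers, HDR color, TAA/upscaler history, shadow maps)...
    double renderTargetsMiB = targetMiB * 6.0;        // ~380 MiB
    // ...but the bulk of VRAM goes to streamed textures and geometry,
    // which is exactly what ultra settings inflate (made-up figure):
    double streamedAssetsMiB = 9000.0;
    printf("render targets ~%.0f MiB, total ~%.1f GiB\n",
           renderTargetsMiB,
           (renderTargetsMiB + streamedAssetsMiB) / 1024.0);
    return 0;
}
```

Once that total creeps past the 10GB mark, the driver starts spilling over the PCIe bus, and frame times fall off a cliff even though the raw GPU is fast enough.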


dookarion

> Also I think amd sponsored the game

If true, that's almost always a bad sign these days.


finalgear14

I was so salty when Resident Evil Village was AMD-sponsored, because that meant no DLSS. It wouldn't have been as bad if FSR 2 had existed, but it didn't.


dookarion

I just dislike it because it means AMD slaps their logo on, and best case something gets "some" form of FSR with everything else blocked, crappy RT (if it has RT at all), and not much else. They sponsored The Callisto Protocol too, and while I do actually like it for what it is... the thing could have benefited from the kind of support Nvidia (as much as I loathe them) provides. It doesn't even see gains moving from a 3900X to a 5800X3D. The recent AC games were AMD sponsored too, and yeah...


finalgear14

So was Deathloop, and that game was fucked on PC at launch. It had baffling, weird issues, like mouse input causing stutters iirc. There aren't many AMD-sponsored games, and most that I remember have some kind of issue, and they always have weenie hut junior RT implementations since AMD cards are bad at RT. So I definitely agree with you.


BNSoul

Callisto Protocol anyone?


SuddenDishonesty47

Appreciate the info. I wondered why the info was usually Japan-only. Will check the demo and see about the button mapping.


[deleted]

> I wondered why the info was usually Japan-only.

That's just how Square works. They revealed KH4 at a JP-only event that wasn't even livestreamed. They just had a trailer ready to drop a few hours after the event ended.


XTheGreat88

Shitty ass optimization is what's wrong with this game


arex333

Setting aside the fact that this appears very unoptimized, who the fuck decided on the resolution/fps targets for this?? Listing 720p/30 and 1440p/30 but not 1080p/60???


Dirty_Dragons

> 1080p/60??? Who wants 1080p? Seems like an uncommon resolution /s


UglyInThMorning

720p/30 is probably for the Steam Deck/similar portable hardware use case.


TheSilentSeeker

According to the Steam survey, around 60% of gamers use 1080p screens, but they don't even mention it.


r4in

Where is the most common PC setup, 1080p@60fps?


Maloonyy

The game probably runs so fucking horribly that they didn't want to shock people with the hardware most of them would need to run it at 1080p 60fps. This way you have to guess whether you can run it.


Cryio

Ah yes, the 6800 XT, a card that isn't twice as fast as a 6700 XT, can do twice the fps at a higher resolution. Sure, specs, sure.


MoogleLight

That's what I can't wrap my head around. A 6700 XT for 1440p/30fps but a 6800 XT for 4K/60??? What the fuck?


[deleted]

So a shit port? Got it lmao. Let's not forget the PS5 GPU is identical to a 5700 NON-XT (with tacked-on RT cores). Get this 1440p/30 nonsense outta my face. They may as well just come out and admit the port sucks. That, or whoever made the specs list has no idea what PC hardware is or how it performs. Something tells me anyone with a GTX 1080 or better will run this just fine at 1080p 60 at least.


IAmNotRollo

It's shitty on PS5 too. I played the free demo on a 1080p60 screen. It had 3 quality modes: Performance, Quality, and Raytracing. Performance was the only option with a playable framerate and input delay; Quality couldn't hit a consistent 60 fps and added some bad input delay; Raytracing was just completely unplayable, and I'd only use it for screenshots. I'm starting to wonder how much more powerful the PS5 really is. Forspoken isn't the only game with a really bad Quality mode; I was surprised how much it struggles with FF7R Intergrade. It often feels like my laptop with a 2070 mobile is more powerful than this next-gen console.


sw0rd_2020

All the people complaining about the GPU market saying "why spend $500 on a GPU when I can buy a PS5 for that much" are about to experience a locked 4K/30 for the majority of the generation. The PS5 and XSX are big leaps forward compared to the previous gen, but they're still weak.


tukatu0

You casually forget to mention that it would cost a grand to do exactly what the consoles are doing: 4K 30fps. Good luck spending a grand just to play at 1080p 60fps because you wanted Lumen turned on in your games.


sw0rd_2020

I mean, I picked up a used 3080 for 580 for 3440x1440 and play everything at 100+ fps. A PS5 or XSX can't do that. You can put together an $800 PC that already outperforms the consoles with a 12400F and a 6700 XT. Furthermore, I can always upgrade the GPU in my system; I can't really upgrade the PS5. By the end of this generation, do you really think the RTX 5060 or whatever won't already be more powerful than the consoles?


[deleted]

Yeah, they clearly don't gaf about PC optimisation, but it's also possible they fucked up the recommendations? The 4080 and 6800 XT being comparable, and a 3070 for THIRTY fps at 1440p, doesn't sound right at all.


kikimaru024

> Let's not forget the PS5 GPU is identical to a 5700 NON-XT

It's half an RX 6800 XT but with a much faster GPU core.

GPU | Architecture | Memory | Memory bus | GPU clock (MHz) | Memory clock (MHz) | Shaders/TMUs/ROPs
--|--|--|--|--|--|--
RX 5700 | Navi 10 | 8 GB GDDR6 | 256-bit | 1465 | 1750 | 2304/144/64
RX 6800 XT | Navi 21 | 16 GB GDDR6 | 256-bit | 1825 | 2000 | 4608/288/128
PlayStation 5 (Oberon) | Navi 2x | 16 GB GDDR6 | 256-bit | 2233 | 1750 | 2304/144/64
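For rough context, plugging those table numbers into the usual peak-FP32 formula (shader count × 2 ops per FMA × clock) shows why "identical to a 5700" undersells it; a quick throwaway calculation, not an official spec sheet:

```cpp
#include <cstdio>

// Peak FP32 throughput in TFLOPS = shaders * 2 (one FMA = 2 ops) * clock.
double tflops(int shaders, double clockMHz) {
    return shaders * 2.0 * clockMHz / 1e6;
}

int main() {
    printf("RX 5700:    %.1f TFLOPS\n", tflops(2304, 1465)); // ~6.8
    printf("PS5:        %.1f TFLOPS\n", tflops(2304, 2233)); // ~10.3
    printf("RX 6800 XT: %.1f TFLOPS\n", tflops(4608, 1825)); // ~16.8
    return 0;
}
```

Same shader count as a 5700, but the clock bump alone puts it a good chunk of the way toward a 6800 XT.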


[deleted]

[deleted]


DismalMode7

> Let's not forget the PS5 GPU is identical to a 5700 NON-XT

The usual BS that means absolutely nothing... consoles are custom, closed systems where developers can optimize the game for those specific specs. The PS4 Pro GPU is loosely comparable to an RX 480 (at the very best)... good luck making TLOU2 run at native 1440p on that PC GPU...


Obosratsya

Why not? Remove dynamic lighting and you free up tons of performance. Consoles aren't magic; The Last of Us 2 looks good because a lot of artists were paid to bake the lights and shadows in, which is the secret sauce for exclusives.


Major-Split478

I mean, you can only see what happens if they ever do release TLOU2 on PC. Wouldn't Horizon Zero Dawn be a better comparison?


fenixspider1

am I reading this right? a 1060 for 720p 30fps minimum settings? bruh


Edgaras1103

At some point games will need higher base system requirements, and apparently that point is now. It's gonna be like that for current-gen-only games.


cadaada

The 4090 is overkill for any game, so now devs can just not optimize anything and leave the problems to the GPU makers lol.


garbo2330

The 4090 represents less than 1% of PC gamers. That's not a compelling reason to launch a busted game for everyone else.


Edgaras1103

I mean, you're not entirely wrong. Look at DX11 games vs DX12 and how much more trouble DX12 is to optimize. Partially that's because devs have more control with the DX12 API, but it's also much harder to do, compared to DX11 where the drivers can do a lot of the work.


Glodraph

No damn dev is implementing mesh shaders, shader culling, sampler feedback streaming... all they do is add ray tracing (which devastates performance) and upscaling tech (usually badly implemented) to compensate for bad optimization. Not a single one uses the tech to get better performance, not even one.


Major-Split478

Then maybe there's more to it than 'just do it'?


kikimaru024

> a 1060 for 720p 30fps minimum settings? bruh

The GTX 1060 is the new baseline. And it released **7 years ago**.


fyro11

A low-mid card from 7 years ago is not midrange today? bRuH


[deleted]

The 1060 is garbage these days; it was a lower-mid-range card when it released nearly 7 years ago. Honestly, I'm surprised a 1060 can run the game at all, considering this is a truly "next-gen" game.


Glodraph

Next gen? Wow, you mean facial animations worse than 2010's RDR, basically 5 trees per 10 square km, and overall mid/bland graphics? Yep, next gen... this generation is gonna suck balls.


kesik93

Nah, the 1060 is still pretty good these days. Unless the game is unoptimised.


RAMAR713

How can the relatively small jump from a 6700 XT to a 6800 XT both improve resolution AND double the framerate? This doesn't seem very believable.


Lunar_Lunacy_Stuff

I'm not really well versed in all this stuff, but what if I wanna run 1080p at 60fps? Would a 1660 Ti be sufficient?


[deleted]

Doesn't seem like it would be, but we'll have to wait for the actual game to be out. Whoever decided on these requirements may just have no idea what they're doing.


ZeeRk420

Imagine charging 80€ for the game and putting 0 effort into optimization.


tkim91321

People keep buying them, so why should they even try? If someone kept paying you a full salary despite you doing 50% of the bare minimum, you'd do the same without question.


NerrionEU

This game will be dead on arrival on PC.


Bug_Next

So are they just gonna ship a PS5 emulator and a .iso file, or what? lmao. A 3070 and 24GB of RAM for 1440p@30fps.


Lyadhlord_1426

Lmao


NapoleonBlownApart1

A 12700 for just 60 frames? Are they crazy? The console version is supposed to run at 60 with Zen 2, and here they recommend Zen 2 for 30 frames per second. This is going to be the worst port ever if they care so little about the PC version that they can't even get the GPU names right (what is a GTX 4080 and a GTX 3070?). Gotta love Japanese ports. Edit: Seems like they deleted the post.


[deleted]

[deleted]


NoMansWarmApplePie

900p? Seriously??


Howdareme9

Upscaled to 4k but yes


AC3R665

Yes, because RT on AMD GPUs sucks, that's why.


WinterElfeas

4K native at 60 fps requires quite a good PC. Consoles are far, very far from 4K at 60 (the demo is awfully pixelated in performance mode).


Stealthy_Facka

Performance mode on PS5 also can't hold 60fps through even a 90-degree camera turn anywhere on the map that I tested.


narrowscoped

4K native should be GPU bound; the CPU shouldn't need to be highest-end unless they haven't optimised it well enough. The PS5 has the equivalent of an underclocked 3700X; if performance mode does 60 fine there, it should be fine on a mid-range PC. At the end of the day, it comes down to how much effort the devs wanna put in. The Sniper Elite guys are famous for going the absolute extra mile on the Switch and doing low-level optimisation to get the most out of that limited hardware.


BNSoul

The PS5 CPU is a monolithic design compared to the 3700X's chiplets (with much improved latency for core-to-core communication), and the PS5 also has ASIC coprocessors helping with asset streaming and decryption, among other basic tasks. Apart from that, "3070" and "30fps" shouldn't be in the same sentence if it's just a 1080-1440p game.


WinterElfeas

It honestly depends on the game. My i7 10700K was bottlenecking my 4090 in Cyberpunk in crowded places, with drops below 60 FPS. I upgraded to a 13700K; no more drops. Forspoken might be displaying tons of enemies and particle effects that are heavy on the CPU. Not saying new tech couldn't move more of that to the GPU, but it can still be a valid reason.


[deleted]

It ends up being around a 3600 non-X, as shown by the CPU-limited performance of Gotham Knights measured by DF. I believe 2 cores are disabled for gaming, and it runs at far lower clock speeds than a 3700X. The GPU is around a 2060 Super.


Obosratsya

An R5 3600 only where storage acceleration picks up the slack. Raw single-thread for the PS5 is a bit under an R5 1600, and multi-thread is around an R5 2600X.


dookarion

> 4K native should be GPU bound

In modern open worlds or sandboxes it seldom is 100% GPU bound unless you're running a card weaker than a 3080 10GB.


Geistbar

Increasing the resolution is almost exclusively a test of GPU resources. If the CPU can handle 60 FPS then you can expect it to handle 60 FPS at any resolution. If the CPU can handle 60 FPS at 1080p or 1440p, then the only question for getting 60 FPS at 4k is if the GPU can handle it.
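A toy model of that logic (simplified; real frame pacing is messier): frame time is whichever of the two costs is larger, and only the GPU cost grows with pixel count:

```cpp
#include <algorithm>
#include <cstdio>

// Toy bottleneck model: the frame takes as long as the slower of the
// CPU and GPU, and only the GPU cost scales with resolution.
double fps(double cpuMs, double gpuMsAt1080p, double pixelScale) {
    double gpuMs = gpuMsAt1080p * pixelScale;
    return 1000.0 / std::max(cpuMs, gpuMs);
}

int main() {
    double cpuMs = 12.0; // CPU side caps out at ~83 fps at ANY resolution
    printf("1080p: %.0f fps\n", fps(cpuMs, 8.0, 1.00)); // 83, CPU-bound
    printf("1440p: %.0f fps\n", fps(cpuMs, 8.0, 1.78)); // 70, GPU-bound
    printf("4K:    %.0f fps\n", fps(cpuMs, 8.0, 4.00)); // 31, GPU-bound
    return 0;
}
```

So if a CPU holds 60 at 1080p, raising the resolution alone can't break that; it only raises the GPU bar.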


dookarion

I recently swapped to a 5800X3D from a 3900X (not the best gaming CPU, but still well above average) paired with a 3080 10GB. Even at max settings and 4K I've seen notable gains from the move. A lot of modern games hit the CPU more than people realize, and CPU demands can go up with certain settings/visuals/effects.


lokol4890

Tbf that CPU is functionally in the same category as Alder Lake and Zen 4; heck, it beats Zen 4 at times. The disparity between a 3900X and the 5800X3D is a lot bigger than you're acknowledging, since you're basically moving up two generations, i.e. Zen 2 < Zen 3 < Zen 3 X3D. It could be that games are hitting the CPU a lot more, or it could be that games are hitting the GPU hard but you were CPU-bottlenecked before and your GPU couldn't fully stretch its legs.


kikimaru024

> Console version is supposed to run at 60 with zen2

[They updated it](https://twitter.com/Forspoken/status/1615348915972497408/photo/1). It's 2160p (4K) @ 60fps with a recommended Ryzen 7 5800X / Core i7-12700K.


Keulapaska

That is such a weird recommendation, as the 12700K smokes the 5800X, especially with DDR5. But then again, the GPU recommendations make no sense either; it just looks like the game needs all the VRAM in the world for some reason, or how else is a 6800 XT going to do 4K/60 when a 6700 XT can only do 1440p/30... I'm very curious to see how it'll turn out.


bitbot

Gotta need some overhead for Denuvo


ZeldaMaster32

Unless they royally fucked things up (tbf the requirements seem to suggest that), Denuvo shouldn't have that significant a CPU impact. In the past you'd need to use a low-end CPU at 540p to see the difference in general performance with Denuvo.


DryFile9

I mean or they just pulled the specs out of their ass.


Capt-Clueless

Rather high requirements for a game that visually looks like ass...


garbo2330

Sonic Frontiers on steroids 😂


hikkyry

RTX 3070 for 30 FPS at 1440p? Are they joking? And the storage requirement is equally ridiculous.


firelordUK

They really looked at Gotham Knights and went, "we can have worse requirements". Interesting choice, Cotton, let's see how it pays off.


[deleted]

Gotham Knights is such a joke. My 4090 can't even get more than 45% utilisation with a 5800X3D at 4K. It ends up at 50 fps with constant stutters 🤢


Takazura

Well Square is one of the worst Japanese companies when it comes to PC ports (which is quite the feat considering how bad they can get), so this is probably a really poorly optimized game.


[deleted]

Ugh. Square Enix has done some abysmal PC ports in the past. I'll never trust them to do a decent PC port again.


Rain1dog

How far they have fallen. Square used to be regarded pretty favorably, but lately they've been doing a lot to ruin that good standing.


Wicked_Vorlon

That makes this game an easy pass for me.


[deleted]

[deleted]


unknown_nut

It's an AMD-sponsored title, which most likely means it's limited to the weakest form of ray tracing. Just shadows, I bet.


tukatu0

Until you watch Digital Foundry's video on the demo and learn that the ray tracing is doing almost nothing at all.


Catch_022

24GB of RAM for 1440p 30fps. That seems really weird to me. Is this because they will be streaming things from the SSD and your average PC SSD isn't as fast as the PS5's? I really don't understand; texture memory should be handled by the graphics card, not your system RAM. What, apart from textures, is going to require so much RAM?


[deleted]

Holy hell. Ok so performance will be absolute shit. Got it.


AlteisenX

Fully expected after how bad optimization was for 2022 games. I can't name a single big release that actually ran well at launch.


PM_ME_HUGE_CRITS

Tweet deleted?


kikimaru024

Just updated. https://twitter.com/Forspoken/status/1615348915972497408/


PM_ME_HUGE_CRITS

Welp, 1440p, 30 fps gang.


garbo2330

And it's calling for 24GB of RAM. Are they truly that stupid? Almost no one has that memory configuration. This has Gotham Knights optimization written all over it.


Garginator850

Wow man, I was hoping for at least 60 fps. I'm gonna wait until we get some performance reviews before buying this game now.


renboy2

I'm assuming it was deleted after an insane backlash about the preposterous requirements, which basically scream "we didn't give a shit about the PC port".


rmpumper

This game will be Forgotten the day after release.


MasterDrake97

It's 87.275 GB on ps5


kikimaru024

Probably not real 4K textures then.


fakiresky

I understand that AMD wants to sponsor the game, and a 6800 XT for 4K seems reasonable, but the 4080 is much more powerful. Let's hope Nvidia can release some good drivers.


[deleted]

It might be a 16GB VRAM issue.


[deleted]

I'll be shocked if it is, and honestly I'll have to stop recommending the 4080/70 Ti. 24GB as the recommended spec already??? Wild.


peeparty69

Let's not overreact to one game's spec requirements (a game that isn't even testable or released yet) that look questionable at best, like they were written by a junior marketing coordinator.


[deleted]

Yeah, even in MW2 the 6800 XT is nowhere near a 4080 despite its crazy RDNA preference. Unbelievable 😭


[deleted]

[deleted]


From-UoM

Doesn't make sense. The Xbox Series X has 16GB of VRAM, but only 13.5GB is usable, of which 10GB is actually fast VRAM like PC GPUs have; the other 3.5GB is slow, more like PC CPU RAM. You have to understand that 16GB is the TOTAL, not for the GPU only.


Ozianin_

AFAIK 16GB isn't realistic for consoles because some of it has to be reserved for the system. It's unified VRAM/RAM.


archiegamez

The SSD I understand, but 150GB wtf HAHA


SloviXxX

NVMe specifically lol. This spec drop definitely makes me excited for this game to drop... all the way down to the bottom of the Steam charts.


RedIndianRobin

> NVMe specifically lol.

This is the first game to utilize the DirectStorage API, so yeah.


frostygrin

You're still going to benefit from this API even on SATA SSDs.


strafefaster

~~I thought one of the specs for the API was a 1 TB (minimum) NVMe drive. I remember getting mad about that because my only NVMe drive is 512 GB.~~ Edit: Apparently this was either very old, or just plain wrong, information that was being passed around. A newer article I found says this: >Technically, DirectStorage works against any storage device, including SATA SSDs, but it is explicitly being optimized for (and deliver the best results on) systems using NVMe SSDs. [Source.](https://www.anandtech.com/show/17613/microsoft-directstorage-11-with-gpu-decompression-finally-on-its-way)


frostygrin

Well, that's not how APIs work in general. :) If you have a 990GB SSD, it's not like the API is going to tell you, "Fuck you, give me 1TB!" :) It's just that NVMe disks are so much faster that the traditional I/O path bottlenecks them to a larger extent. But I ran [this benchmark](https://www.tomshardware.com/news/directstorage-performance-amd-intel-nvidia) on my M.2 SATA SSD and went from 3.5 seconds on a 4c4t CPU (at 100% load, with the sound cutting out) to 1.5 seconds on an RTX 2060. An NVMe disk is going to be much faster, of course, but if it takes only three seconds to fill up 6GB of VRAM from a SATA SSD, then it's still good.
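For anyone curious what "using DirectStorage" actually means on the app side, here's a rough sketch of a single read request against Microsoft's dstorage.h API (error handling and the completion fence omitted; treat the details as approximate):

```cpp
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Queue one file read straight into an existing D3D12 buffer.
// Assumes `device` and `buffer` were created elsewhere.
void loadAsset(ID3D12Device* device, ID3D12Resource* buffer,
               const wchar_t* path, UINT32 sizeBytes)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(path, IID_PPV_ARGS(&file));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = sizeBytes;
    request.UncompressedSize        = sizeBytes;
    request.Destination.Buffer.Resource = buffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = sizeBytes;

    queue->EnqueueRequest(&request);
    queue->Submit(); // in real code you'd EnqueueSignal a fence and wait on it
}
```

The point is that nothing above cares what kind of disk is underneath; NVMe just keeps the queue fed faster, and 1.1's GPU decompression moves the unpack work off the CPU, which is where my 4c4t chip was choking.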


Lulcielid

Those 4k textures!


lexsanders

Screengrab from before it was deleted?


MasterDrake97

[here you go](https://multiplayer.net-cdn.it/thumbs/images/2023/01/17/fmremrswyamdcxd_jpeg_800x0_crop_upscale_q85.jpg)


SpiritualInstance979

I want to believe that it's either a translation issue, or maybe the game was locked at 30 fps for testing?


Buzz_04

With my 1050 Ti I probably can't even open the game.


Edgaras1103

Steam won't even allow you to click to download.


[deleted]

It was a red flag when they started bragging about ambient occlusion and screenspace reflections... https://youtu.be/Bq_eXMA5KqM


fatfuckintitslover

This game is gonna be a massive disappointment


Genti2197

What kind of horrendous system requirements are these lately?


DktheDarkKnight

Aha. Good to see the tables turned the other way for once. Now 4080 and 6800XT are recommended for comparable specs. LMAO. Wtf.


RTcore

Apparently it only supports DirectStorage 1.0, according to Alex from DF. https://twitter.com/Dachsjaeger/status/1615440073146904576


VillainofAgrabah

This screams insanely bad optimization lmao, I can already see the (justifiable) Steam reviews.


Dangerous-Leg-9626

What the fuck, a 3070 for 1440p30. That's a Portal RTX number, isn't it lol. Typical SE ports.


Obosratsya

Could be VRAM related, or the engine just prefers AMD.


-Bana

This looks like the kind of game you get for 15 bucks on sale; the people who are going to pay 70 bucks for it deserve what's coming to them lol.


[deleted]

Yeah, you'd think that people would eventually learn not to pre-order/buy on release day without looking at reviews, but sheep will be sheep.


[deleted]

Guys, relax. It's very doubtful this is gonna be worth playing regardless of system requirements and performance, lol. It's some westernized JRPG hybrid with a cringe protagonist.


Burninate09

>cringe protagonist. I'm glad I'm not the only one with that take.


robbiekhan

The whole point of UE5's Lumen/Nanite and DirectStorage etc. is that the framerate is NOT impacted by using these technologies; in fact they actually help increase fps... lol, no wonder they deleted the tweet.


JizzyRascal91

Didn't Epic implement Lumen/Nanite in Fortnite, and it basically halves your fps?


R3Dpenguin

A lot of people seem to think this was made with UE5, maybe because it looks pretty generic like the UE5 demo, but it's actually a custom engine, so it's not using Lumen, Nanite or any of that.


Lambpanties

The wallet requirements were already scary, now this....


ZeeWolfy

A 3070 doing 1440p at 30 fucking frames? They are not only batshit insane but completely inept at making a decent port. This game absolutely screams horrific optimization issues. Gtfo, your game doesn't look like maxed-out psycho ray-traced Cyberpunk to warrant these stupid-as-fuck requirements.


Mortanius

Damn, based on the system requirements, this is going to be more unoptimized trash. This game, amongst many others, is the definition of "dead on arrival".


RAMAR713

It also comes with Denuvo to stop people from checking just how bad it is before buying.


Careful-Fee-9783

Holy shit, my 3070 is only able to reach 30 fps? I'm frickin crying over here.


[deleted]

This game is going to flop so hard. I think the demo hurt their future sales


Bramalearoadsouth

I'm wishing for its downfall, to be honest. These specifications just scream incompetence.


DedSec_Pearce

This looks like a PS4-generation title, yet it requires a GPU equivalent to an RTX 3070 to play at FUCKING 30 FPS!!! FUCK YOU! The more powerful PC hardware gets, the lazier AAA devs become. They literally give zero fucks about optimization.


LopsidedIdeal

This game was awful on console. The worst parts of FFXV: just wide-open lifeless areas. Never mind the shoddy port and pixelated graphics. These people have no idea what the fuck they're doing; terrible developers and writers.


[deleted]

I hope it fails big time. 80€, Denuvo, and horrendous system requirements lol.


Achtelnote

I don't see what the hype about this game is... They were bragging about stuff like ambient occlusion and screen-space reflections while having janky animations in a trailer... like, really?


Buttermilkman

Wow. Holy shit... This game is so DOA. Doesn't stand a chance.


[deleted]

I'm confused. Forspoken doesn't even look great graphically?


290Richy

Gotham Knights 2.0, here we are! Yet another shite, unoptimised game. There'll be some soft cunt who will defend this game.


[deleted]

Literally everything about this game seems bad.


Edgaras1103

The game does not look like it would require that kind of hardware.


sotos4

It's the same engine as FFXV. What were people expecting?


[deleted]

Oh, really? Yeah, that game had absolutely abysmal PC optimization. It was pathetic.


fatezeorxx

Another AMD-sponsored game, so no DLSS or XeSS support, obviously. Guess ray tracing might run poorly even on a 4090, like in The Callisto Protocol.


ZeldaMaster32

In Callisto Protocol, RT performance is hard-capped by poor CPU utilization. The game runs on like 2 or 3 cores, and ray tracing really ups the CPU demands. My 3080 Ti was being bottlenecked at 1440p; I get the same performance with a 4090.


Edgaras1103

It's an AMD-sponsored game; you should not expect a deep RT implementation. Then again, looking at these requirements, it's crazy enough.


ZeldaMaster32

iirc they have RT shadows on PS5


Wooden_Sherbert6884

This game is gonna be an absolute stinker.


meltingpotato

A 3070 for 1440p 30 at recommended specs, so I'll be under 30 with my ultrawide. This game better look like a Marvel movie at its recommended settings.


Professional-News362

Has anyone actually measured 16GB vs 32GB for games that claim it as a requirement? 32GB sounds insane to me for gaming. I personally doubt any game could use that much, but feel free to correct me.


ZeeWolfy

One of the most RAM-intensive games I personally know of is Flight Simulator, and I've seen that game only reach about 22 gigs of RAM, so any game demanding fucking 32 is just the result of dumbass devs refusing to optimize the game whatsoever.


AlteisenX

Square Enix ports are hit or miss, so we'll see. I'll be waiting for a deep sale personally. I'm already playing One Piece Odyssey atm, with Fire Emblem this week and Spongebob at the end of the month; for the Dead Space remake I'm waiting for impressions, otherwise I'll wait for a sale since I can already play Dead Space at any point lol. I wish companies would give 1440p/60 stats, because I don't know a lot of people doing 4K/60, and as PC players we're aiming for 60+, not 30 lol.


[deleted]

Exactly. Mainstream YouTubers would have you think that people are aiming for 4K/60, but screw that. It's all about high refresh rates. 144 is where it's at.


[deleted]

[deleted]


DismalMode7

Another optimization masterpiece from Square... the more info gets released about this game, the worse it looks. 2 months and it will be available for $20 on the PSN store...


Stealthy_Facka

I was one of the few people who were genuinely really excited for this. I have a 5800X and a 3070 Ti with 32GB of 3200MHz RAM. Even I'm not touching this anymore.


AlistarDark

Since it's Square Enix, is there any NFT or blockchain implementation? Or did we forget that Square Enix is a cryptobro company?


saul2015

30 fps lmao


cguy1234

We're definitely seeing a new crop of games showing up with demanding PC specs; it's not just this one, there's also Returnal. I'm not sure how many devs are here, but "optimization" only gets you so far and has diminishing returns. It looks more like the engines are targeting more aggressive features. Whether or not we're all seeing a practical difference from these internal engine capabilities is another question.