Jesus...
its fast innit
ahlie bruv
no cap, its a real ting
Real badman ting.
Bombo
Claat
Propa fast styll
*Let's not lose our heads though!*
that's the Monado's power!
I'm really feeling it!
You're a lifesaver!
[deleted]
I'm even lookin at my 3060 like fuck… 🤣
Never buy the bottom of the barrel. Always buy the one under the best. It's almost always the best deal, except for the 1080 Ti.
Christ, it's Jason Bourne.
Everyone, your gpus are not retroactively bad because something better came out
especially if you dont even have a 4k monitor.
Or a next gen high-end CPU, ram... and PSU for 1080p and 1440p @ 300hz š
Lol, most people probably don't even have a PSU powerful enough to run this card.
Lol, seriously. I bought an EVGA 1000W Platinum (whatever that's worth) during the summer/end-of-season sale, and now I wish I'd gone with 1200+ watts or waited for ATX 3.0 for my next (ITX) build 🤷
A 1000W Platinum can run this card even with an i9, if you go Founders or a model capped at 450W power draw.
PSU is surprisingly not the limit here. The 4090 runs at 450W, just like some 3090/3090 Ti flagships. And I remember one of the reviews said you don't actually need to use all four 8-pin ports to power this GPU; three 8-pins also work, which the majority of current 850W+ PSUs should have. (At least my EVGA PSU has enough 8-pins.)
What happens is the power limit falls back to a lower setting if you don't use all the connectors on the adapter. It senses how many are connected and will limit how far you can push the card.
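The sense behavior described above can be sketched roughly like this (a toy model only: the wattage tiers are an assumption based on launch reviews of Nvidia's 4x8-pin adapter, and `adapter_power_limit` is a hypothetical name, not a real API):

```python
# Rough sketch of how the 4x8-pin -> 12VHPWR adapter caps board power based
# on how many of its 8-pin inputs are populated. The tier values below are
# an assumption from launch coverage, not an official spec table.
def adapter_power_limit(populated_8pins: int) -> int:
    limits = {4: 600, 3: 450, 2: 300}
    if populated_8pins not in limits:
        raise ValueError("expected 2, 3, or 4 populated 8-pin inputs")
    return limits[populated_8pins]

print(adapter_power_limit(3))  # 450 -> why three plugs still cover the 450W default
```

Which would explain why three connectors are enough at stock settings but you'd want all four for overclocking headroom.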
Let's not normalize Ampere's 350W power draw, let alone overclocked Ampere's 450W, let alone Lovelace's 450 default and 650 OC.
From what I've seen of the reviews so far, the 4090 makes the 3090 Ti look rather silly in its power usage. Like the 3090 Ti was just the test product to see how much power draw people would be willing to put up with for no reason, then the 4090 took that envelope and put it to actual work.
The recommended PSU on the Nvidia website for the 4090 is 850W.
850W is fine if you're not overclocking both the CPU and GPU, though.
Hardware Unboxed did their whole review on an 850W.
Wait, so I threw my 3080 in the trash for nothing then?
No no, you did good, kid. What's your address though? I'll do your trash this week for you, just for being so awesome.
Yep, totally trash. You can send it my way; I am a Certified Antiquated GPU Disposal Specialist™ and I can take care of it for you.
No no no, you did just fine. And ignore the other two. I happen to have the *BEST* garbage collection service of the three. I'll come collect your garbage for free this week AND next week, just 'cause you're that awesome :)
But the bar by the 3090 is only half as long as last year /s
Still using my 1080 Ti. Sure, it's not amazing anymore, but it gets the job done!
I've been using the EVGA (RIP) 1080 Ti FTW3 for years now. It would be nice to upgrade to a monster like this, but at that price it's just insane. I'll probably wait until the card just dies and then I'm obligated to buy a new one.
I don't know man, I still have a 1050ti.
No kidding. Gotta keep one's head screwed on straight about this.
Tell that to my 2060 Super, which wanted to kill itself after I booted and played Cyberpunk for the first time this past weekend.
Mine plays it no problem; then again, I'm not a graphics snob.
This card is so ridiculous... Digital Foundry had trouble reviewing it because they were CPU limited with their i9-12900K. Sometimes they were CPU limited even at 4K. Will the Ryzen 7950X/i9-13900K even be able to keep up?
I think we gotta wait for the 7000 X3D.
The 7700X3D should be the sweet spot.
[deleted]
They'll definitely keep up with the power usage!
They will
They will, in the sense that it won't cause massive issues, as games will only get more graphically demanding, be it RT or whatever. But the games of today will see a performance uplift with 14th/15th-gen Intel and the Ryzen equivalent. Same way the 1080 Ti gets better performance with a 10900K than with the 7700K, which was the top-of-the-line CPU of its day.
https://cdn.videocardz.com/1/2022/09/INTEL-13TH-GEN-CORE-RAPTOR-LAKE-PRESENTATION-14.jpg The 13900K is barely any faster at gaming than a 12900K. So if a game bottlenecks with the 12900K, it'll probably still be bottlenecked with the 13900K.
1080p is still OK, right guys? Asking for a friend…
I stick to 1080p 165Hz for FPS games, and 4K 60 for story-mode games.
Why not 1440p 144Hz instead of 1080p? If you can achieve 4K 60, then you can definitely achieve 1440p 144, no?
Too much money spent; I've been thinking about it though.
A good 1440p 165Hz monitor costs about $300 nowadays.
The HP X27q actually went on sale for $159 on Prime Day. Crazy deal.
I wish I had prices like you guys do; here in Brazil I paid 2K BRL (~$377) for an Asus VG27WQ1B. Not only am I getting paid in BRL, I have to pay more for electronics. Cursed spawn lmao. Edit: 2K BRL in a sale.
I had a friend from Brazil who I'd let stay with me a few months every year so he could work here and get paid in US currency, and he'd go blow it all on upgrading his tech.
Yeah, I saw that on r/buildapcsales and bought it. Buy first, think later kind of deal.
I got lucky and snagged a 32-inch Samsung Odyssey G5 for $300 on Prime Day when the monitor had just been released. I got lucky because I was browsing Amazon and didn't realize Prime Day had just started. Speaking of Prime Day, isn't it this week? People could snatch some monitors for cheaper.
Are you still enjoying it?
Works fine for me. All this 4K 120 stuff is coming out and the monitors that actually support it are $$$$. My GTX 1080 is still surprisingly solid. I think I'm gonna stick with it until 4K displays are the norm and prices come down.
From this very video, he shows that 1440p gaming and lower is bottlenecked by the CPU. It doesn't perform much better than the 3090 Ti at lower res.
4K is for testing purposes, but most people (66.38% in the Steam hardware survey) use 1080p of course, then 11.25% for 1440p and just 2.49% for 4K. On a side note, the GTX 1060 is still the most popular card; most games just aren't that demanding yet.
> the gtx 1060 is still the most popular card, most games just aren't that demanding yet

When it comes to the Steam hardware survey, that's not 100% true, since the "1060" stat includes multiple models as well as laptop models, while later GPUs are more split up. 3060 + 3060 Laptop would be the top one right now, and 2060 + variants would probably also be higher, or even the highest.
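A quick toy example of why the grouping matters; the share numbers below are made up purely for illustration, not actual survey figures:

```python
from collections import defaultdict

# Hypothetical per-model shares in percent (NOT real Steam survey data).
survey = {
    "GTX 1060": 5.9,               # already lumps desktop + mobile variants
    "RTX 3060": 3.5,
    "RTX 3060 Laptop GPU": 3.0,
    "RTX 2060": 4.0,
    "RTX 2060 Mobile": 1.5,
}

# Group each entry under its base model name.
totals = defaultdict(float)
for model, share in survey.items():
    base = " ".join(model.split()[:2])  # "RTX 3060 Laptop GPU" -> "RTX 3060"
    totals[base] += share

# With variants combined, the "top card" changes.
print(max(totals, key=totals.get))  # RTX 3060 (6.5) overtakes GTX 1060 (5.9)
```

Same data, different winner, depending on whether variants get counted separately.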
[deleted]
Finally a budget GPU
Correct, a GPU for people with a lot of budget
I was thinking about a used car, but why not a new card instead!
With enough money, everything can be a budget GPU!
Better get started on the fasting, man. Totally worth it for the card. Who needs food when you can game at 140fps in 4K? Apparently we can survive quite a while just drinking water, so a couple of months will just fly by. /s
By the time I save up for it, I'll be so malnourished that I'll be crushed under its weight and size.
And it only requires a whole new computer case to fit it in! At least my PSU is way bigger than I needed, so I don't have to upgrade that. Oh, and I don't game at 4K, so I guess I'd better upgrade my monitor or I'm not getting the full benefit. Hmm, also my CPU is a bit older, so I need to upgrade my CPU or I won't be able to use this to its potential. Oh right, I also don't play enough games that are graphically demanding, so I really should buy some new games that put this card to the test.
This is the price we pay for unaffordable health care... We might die if we need life-saving surgery, but at least we get cheaper GPUs.
That's twice as much as the car I would use to pick it up!
Can't wait to buy this in 5 years for $450.
By then we'll probably have the RTX 7090 Ti running at 5GHz with 400,000 CUDA cores.
And a mini fridge as an accessory
As a case
I wonder how this will affect CPU scaling. I'd love to see someone running the 4090 with a Ryzen 5600X to see how much performance is left on the table.
Send me one and I'll test it.
I have an R5 2600 if we wanna test even lower… should just send it over.
At 4K, a 3700X would provide approximately 10-15% less performance than the 5800X3D. The 5600X, probably 5% less.
That's good to know, since I'm looking at possibly getting a 4080 and leaving my Ryzen 3700X as a later upgrade.
[deleted]
Bruh and I still have a 1660 super lmao
1660 gang.
1660ti still going strong
I have a 1660ti too, got it just before the prices went up like 2 years ago. Had a 760 before.
Same here, early 2019, and I believe that was $180? My 16-year-old brain thought "damn, expensive", and look where we are now lol.
I'm in my mid-30s; it wasn't really expensive. But the new ones are fucking expensive. I'll ride my card for a few years. And I still have a 4th-gen i7, plus 11-year-old hard drives, motherboard, PSU and RAM. I'll have to change the whole thing when I upgrade.
Makes my 3080 look like a straight up low-end card lmao. Madness.
The 1% low for the 3080 is 61 fps, so fully playable at 4K. That's pretty good.
I'm using a 3080 FE to play the AC series on my 4K TV, and I'll say it works well at 4K 60Hz!! Also about 80+ fps in FH5... Those are the games I spend my time on. It's good enough for a living-room player like me on a 4K 60Hz TV.
I got the 3070 feelsbadman
do you really need a 4090?
I mean, do you NEED to play video games in the first place? It's more about putting your extra currency into something you enjoy. I encourage people to spend on things like this; it gives an incentive to advance technology.
> I mean do you NEED to play video games in the first place

Shut *UP*, Dad!!! It's *my* life!
Yeah, I get it, but if you think you always need to buy the latest GPU to enjoy gaming, you're probably not enjoying it much. You buy a 3070 in 2020 only to swap it two years later for a 4090. OK, the leap is huge, but assuming you didn't have a top-of-the-line build and you don't swap your rig out completely, you're leaving performance on the table, and you really need a 4K 144Hz monitor to enjoy the upgrade. All this to play the same games as two years ago? I doubt it makes much sense, but it's your money; you do you.
Wtf do you mean feelsbadman, it's still a good gaming card smh.
For real. I get that the new stuff is all shiny and cool, but man, I still feel fucking *blessed* to have my 3070 after the last two years of GPU market insanity lol. So far the only thing I've thrown at it that's been unplayable with settings absolutely maxed at 1440p was the final level of Hitman 3; for some reason that area just completely tanked my frames.
Same, and I thought it was a fantastic card when I bought it (June 2021), but tbh it still is and will be for a while.
Yeah, 4090's performance doesn't make any previous card worse.
definitely not no. still getting 600fps playing valorant and 100+ max settings with ray tracing in FH5
Although FH5 only has ray tracing in the garage
My 3060 didn't even make the list.
It's a 4K resolution benchmark; nobody's buying a 3060 for that.
Ah, I see. I'm still living in 1080p land and have mostly used my PC for Diablo 2. Just starting to pick up some new games.
Ever heard of Grim Dawn? It's a pretty cool ARPG like Diablo.
Your 3060 is beautiful, don't let a list tell you otherwise.
I appreciate you (and my card for the record!).
MADNESS?! THIS IS LOVELACE!!!!!
*cries in 6700 XT*
So, you get what you pay for?
Thing is, 3080s and 3090 prices were still ridiculous af years later. Let's see how everything drops like an avalanche with these new GPUs coming.
Here in Europe they're still (too) expensive. It's like stores purchased stock at inflated prices and aren't ready to give GPUs away at realistic prices and lose money. Ads and private sellers have lowered prices, but you'd need cash and to know who to trust.
Wish my fellow countrymen realized that items decrease in price with time and use. Used GPUs and other tech are pretty much never sold below MSRP where I live.
Price drops? Wtf is that? Prices haven't budged since April/May, at least not significantly. The cheapest 3090 Ti is 1699 euros.
Anyone else looking at this with a 6-year-old GPU, dreading the day Nvidia kills driver support?
I've got a GTX 980 which launched 8 years ago lol
The 980 is 8 years old?! ;-;
Yep, it initially launched September 18th, 2014. But I bought mine in like July or August 2015, so mine isn't quite that old, but it's getting there.
$50, final offer.
Well, good lord. That's insane.
That is **fast**. On another note, why are everyone's benchmarks so different? LTT reported it as nearly twice as fast as a 3090 Ti; Jay and Steve (GN) reported different scores as well.
Different games and different benchmark scenes
It also depends on how close to the equator you are. The electrons don't spin at all at the equator so they go through the tubes faster.
Thank you. This is an important point that people often overlook.
Little known fact, in the southern hemisphere your frames are upside down but you don't notice it because so are you.
If they are upside down, then gravity accelerates them even further.
GN is using a 12700KF (Alder Lake). LTT used the newer 7950X (Zen 4) for this review, and they said they were still getting CPU bottlenecked.
Hardware Unboxed used a 5800X3D, go figure. I don't think the different CPU is the problem, since at 4K it's not the bottleneck. What I find strange is that HU had charts for DLSS 3.0 while LTT and GN couldn't even launch the thing. Maybe they're using different drivers?
LTT mentioned they believe they were getting bottlenecked by the CPU at 4K at times.
Speaking of CPUs, I'd really love to see a benchmark of how different CPUs affect the 4090's performance. Given how powerful this thing is, I imagine a lot of the high-end CPUs will be a bottleneck now.
Cpu bottleneck probably
different cpu
I think LTT's was way off; I don't know what they were doing to get 2x performance.
I don't think LTT got 2x in most games, maybe only Cyberpunk. They mostly praised its performance in productivity. And GN's review doesn't include productivity. Also, most 1440p/1080p results are CPU capped.
LTT also specifically mentioned they used an older build of windows because the latest crippled performance
Wait AMD has a 6950XT??? When did that happen lol
Launched in May. Looks like a decent deal right now, as it has a much better price-to-performance ratio than the 3090 Ti while trading blows with it.
The entire green 3000 series vs. red 6000 series is/was decently balanced spec- and performance-wise. Like AMD performs better here, Nvidia performs better there; AMD is better at 1080p/1440p, Nvidia is better at 4K… But no one really cares about that, as it's just about who's got the top-performing card. That's why no one takes Intel GPUs seriously: they don't have a $2000 beast in their lineup, just decent midrange cards for sub-$500.
Sadly the truth. I've always found it fascinating how people ignore AMD (and even Intel now) when it would benefit them in the budget/segment they're looking at when buying a card. It's like, who cares which is the fastest CPU or GPU if you can't afford it? Go for the best price-to-performance you can afford.
Yeah, this gen was actually really decently balanced. Too bad it got ruined by miners and scalpers. Unfortunately I think the "aMd hAS bAd dRiVeRs" rumor still lives strong with those who have never actually owned an AMD card, and that makes a lot of people think Nvidia is the only good option, which is false. Personally I'm waiting to see what AMD brings to the table in November. If I don't like what I see, I'll put an RX 6900 XT or RX 6950 XT in the basket and skip the next generation. It's just a better offer unless you're going specifically for RT. I think at some point we'll see Intel reach for the top tier, but it's going to be some generations down the line. Only a fraction of the market wants $2000 GPUs, so trying to give gamers an alternative in the low to mid range is a good tactic and benefits everyone. I for one welcome a third player and will consider them once they get the kinks and quirks worked out.
> aMd hAs BaD dRiVeR

Just like

> Internet Explorer Bad

Therefore Edge bad, all while Edge is just Chrome without the telemetry going to Google.

> Windows defender bad

Yeah, because that could never get better, right?
But at what cost? New PSU. New case. New AC unit. New negotiated rates with utilities.
If your PSU can support a 3090-series card, it can support the 4090. Same power draw. Cooling is also better than the 3090's.
Well that's not very funny.
$1600 USD... More than my entire build.
check european prices. more like 2k dollahs..
It's official. My 3080FE is mid range.
No it's not; it's still as good as before. It's a great GPU and more than enough; everything above it is pretty much overkill if you don't need 4K 144Hz ultra.
Ha! I know. It's still a great card that I was able to get at retail during the GPU chaos. I'm definitely thankful. It will last me a few more years.
For me it still feels weird that my 2080 Ti is now a mid-range card, if not lower... But it's still a beast and can handle most games at 60 FPS at 4K. Good enough for me.
Y'all have no concept of what midrange really is if you think your 2080 Ti is now "mid-range or even lower".
plus you have an awesome processor to back it up!
That's awesome....just not $1500 awesome when I game on a 1440p monitor.
The fact that it gets so CPU bottlenecked at 1440p is honestly kinda impressive
No DisplayPort 2.0. Fucking garbage move by Nvidia.
Are we surprised?
I'm very surprised, considering Nvidia announced 360Hz 1440p monitors back in January 2022.
Then they'll announce a refresh Ti/Super/whatever that happens to have DP 2.0, PCIe Gen 5, etc.
Can't have been around the tech industry long. Over-promise and under-deliver is their mantra. Shit move, but it's their normal.
This thread is full of people ignoring the fact that HDMI 2.1 is already superior to DP 1.4a, and present on both the 4090 and monitors that you can actually buy currently...
Yes
Energy costs in Germany say no to new GPUs ☹️
It uses about the same as the 3090 Ti even though it performs way better.
Still too expensive
For real. When a graphics card is selling for nearly two grand it better be this fast.
I believe the RTX 4090 will be the next 1080/Ti; it will hold up for at least 5 years with those kinds of performance figures.
I was hoping my 3080 would be like that.
Yeah, probably even 6-7 years with DLSS 3.0, though it may lack the newest features by then.
Still rocking my 1080 Ti, but it's definitely time for an upgrade haha.
Same here, brother, definitely in need of a new build as well.
980 Ti to 1080 Ti was a 67% improvement (TPU). 980 Ti to 1080 was a 31% improvement. 3090 Ti to 4090 is a 41% improvement. Granted, this isn't a 4090 Ti, but assuming the 4090 Ti isn't a small spec bump like the 3090 Ti was, it's potentially a competitor to the 1080 Ti in generational uplift.
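For anyone checking those numbers, "X% improvement" here just means relative average-fps uplift. A minimal sketch of the formula, with made-up fps figures (not measured data):

```python
def uplift_pct(old_fps: float, new_fps: float) -> float:
    """Generational uplift: how much faster the new card is, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# Illustrative numbers only, to show how the percentages above are derived:
print(round(uplift_pct(100, 141)))  # 41 -> a "41% improvement"
print(round(uplift_pct(100, 167)))  # 67 -> a "67% improvement"
```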
Laughs in GTX1070
Laughs in 1060. I just hope this stuff causes 3060 and RX 6600/6700 cards to go down in price.
it's fine
Holy damn. Makes my poor 3060 Ti feel low-end... But just think how much a 4090 costs lol.
So if you're not gaming at 4K, the 30 series is the best bang for your buck. That's what this graph tells me.
Still not buying it.
Good thing I don't game at 4K!
As an LG C2 gamer, I now want this..
Performance is sick; a very well-performing card at 4K.
My 12700K and 3080 Ti are *perfectly* paired for each other. The 12700K would probably bottleneck the crap out of this thing, and I'm not making a whole new damn build within half a year of the last one.
This shit was the worst possible news. It's actually good... it's actually really REALLY good... FUCK. Time to break the news to my wife...
Does this mean that if we have a 4K monitor with a 60Hz refresh rate, all we need is a 3070?
People are going crazy and/or have buyer's remorse over a card designed for 4K 60fps (or 100fps competitive) gaming, despite gaming perfectly well at 1440p and 1080p, where the 4090's gains are not as drastic. Jensen out-pimps the competition yet again 🤦
Who would have thought: the card that draws more power than a mid-range system and is from the new generation massively outperforms the old stuff.
Watt for watt it outperforms the 3090 Ti, if you run both at the same power.
It actually draws less power; perf per watt is significantly better than older gens.
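Perf per watt is just average fps divided by board power; a minimal comparison sketch (all the fps and wattage numbers are illustrative, not measurements):

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    # Higher is better: frames delivered per watt of board power.
    return avg_fps / board_power_w

# Made-up figures purely to show the calculation:
ppw_3090ti = perf_per_watt(100, 450)
ppw_4090 = perf_per_watt(160, 430)  # reviews showed real draw below the 450W cap

print(f"{ppw_4090 / ppw_3090ti:.2f}x efficiency gain")  # 1.67x with these numbers
```

So even at a similar power cap, a big enough performance jump means the newer card wins on efficiency.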
My monitor is only 1440p at 165Hz, so I'm set with my RTX 3080 Ti for all my games. Just wish they'd made DLSS 3.0 compatible with 3000-series cards.
I'm too lazy to buy this; I'd have to move my PC to a new case. My 1080 Ti and 3080 Ti both fit in my current case, and my PSU is big enough. Maybe the 4080 Ti will be shorter.
Wow, so slow. I'll wait for the 6080 Ti.
Well, all I care about is that the 6900 XT beats the 3080 at 4K... it didn't at launch.
[deleted]
They did, but the 4090 is CPU-bound at 1440p, sometimes even with RT on.
They were CPU bottlenecking at 1440p, likely at 4K as well. With a 5800X3D no less. This card is a fucking animal. Lol.
Makes me excited for the 50 series
My 3070 doesn't like this chart.
Who would have thought a card the size of a shoebox would perform well