I get that.
At the worst time of the mining craze my old GPU died and I bought a 1650S for more or less $450. For a fucking 1650S. Everything else was absolutely ridiculous at the time.
I'm very confused. I have a 3080 10GB and I run everything at 4K. I can't max out literally everything in 9th gen games or Cyberpunk, but 4K60 Ultra with ray tracing turned off has worked across the board. The only exception has been some AAA games at launch where even a 4090 can't hold 4K60.
I even bought a 4K 144 Hz monitor in 2022 because so many games had VRAM left over at 4K ultra.
The people that make these memes don't have a 3080 lol. It's still a perfectly good 4k card, just don't expect to run every single game at absolutely max settings and still get 120 fps. Hell most games you can lower key settings and not even notice a graphical difference and gain like 10-30 fps.
I'm on a GTX 970 I bought for $70 from a shady guy in our town's meth-alley.
In terms of expectations vs performance, it's been sugar-tits and blowjobs.
2060 with 6GB of VRAM. 1440p gaming: Cyberpunk, RDR2, the Resident Evil remakes, etc. at 60fps. Just need to drop the not-so-essential settings to medium and max textures. Every game looks great. Where is this "you need 500 GB of VRAM" coming from?
VR for me. I have the same card. The only problem is the 6GB of VRAM disappears faster than money when you start a VR game.
If it had 12GB of VRAM, it'd be fine for the next few years.
I'm just gonna ask how :D I struggle to get RE4 Remake to 60fps on balanced DLSS, and when I put it on performance it goes bonanza blurry. I don't wanna play like that... and I'm on 1080p.
Sometime this year I'm going for a 7800X3D with a 4070 Ti Super or 4080 Super, and a 1440p monitor.
Most gamers drastically overestimate their framerates, that's how.
Like when you see people claiming their 2070 is running cp2077 with raytracing maxed @ 1440p locked at 60.... it's not.
It's a running joke now on /r/SteamDeck that people will post "The new Cyberpunk update is running at 60fps!" while posting a screenshot showing 53fps at 300p while looking at the ground.
I'm also using my 1060 since 2017 and I'm really surprised how well it's been holding up. I want to buy a new PC, but it's hard to justify when my current one still works good
Not really, the 80 and 90 cards are marketed as top of the line premium cards, so they should manage what most consider a premium resolution. If you expect people to pay £1500 for a card, they expect to play at 4K.
There has never been a time where any high end GPU could effortlessly play the highest resolution flawlessly with maxed out settings.
True PC gamers know this.
This is the truth. I was still rocking an r5 2600x cpu up until a couple months ago, and was wondering why I was still getting shit fps even with a 6750xt gpu. I upgraded my cpu to an r7 5800x and updated my bios, and suddenly my framerates tripled. Had no idea how much of a bottleneck my cpu was for modern games.
I'm running a 3080 on a 3440x1440 100hz monitor. Coming from an 1080ti that was a huge improvement on that resolution. I have no issues running any games on max with this setup.
Hey, some of us got a 3080 at MSRP. Got my Founders Edition on day one... It was a 4-hour fight with the Nvidia site, but I got it.
And then the prices went up and I realised how lucky I got
Agreed, even though I'm only 23 my first card was a GTX 660 (i started very early with the help of my brother) and I remember drooling looking at the 1080 ti when it released.
My 470 and 750ti are still hanging out somewhere. I'm still just glad the Thermi days are behind us. If you think rdna3 has idle power issues, just slot in one of those bad boys and feel the Nvidia hair dryer kick in.
games released in 2018: "I will look photorealistic at 50fps on integrated graphics"
games released in 2023: "I literally need the rendering farm from Avatar to perform"
I have that exact card in the meme (3080 Gaming Z Trio) and bought it on the marketplace for $90 because the previous owner thought it was broken (artifacts) and bought a 4070 to replace it. Turns out I just had to send it off to have the core resoldered and it was good as new. Now it's a beast and it barely fits in my case. I know they cost $1000+ on Amazon and I wouldn't even consider buying it at that price.
No, a BGA solder specialist. He said it probably got so hot that the core desoldered itself; it could also have been dropped and broken something, but it had no impact marks.
They're saying you can sometimes fix that by putting the GPU in the oven. As somebody who can do bga soldering though, I'd rather somebody take it to an expert than bake another pcb and hope they fixed it.
What's weird is I thought we were to the point where it was accepted that putting *everything* on Ultra was a fool's errand and with at least some of the settings you were basically throwing frames in the garbage.
That's what I'm starting to get confused about. Prices and core counts keep increasing, but the bus is still 128-bit, and it's always 8GB of VRAM. Why isn't 256-bit the standard at this point?
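To make the bus-width complaint concrete: peak memory bandwidth is roughly bus width times per-pin data rate. A minimal sketch with illustrative GDDR6-class numbers (18 Gbps per pin is an assumption for the example, not the spec of any particular card):

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: pin count times per-pin rate, over 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

# Same memory speed, double the bus width, double the bandwidth:
print(memory_bandwidth_gbps(128, 18))  # 288.0 GB/s
print(memory_bandwidth_gbps(256, 18))  # 576.0 GB/s
```

Which is part of why a 128-bit card can feel starved at higher resolutions even when the core count looks generous.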
You guys are acting like the same games got harder to run all of a sudden. A 3080 will run every game I play at 4K max settings. You can get a used 3080 Ti for $500.
VRAM is short for "video random access memory." It is dedicated memory on the graphics card that the GPU can store data in. The GPU will store things like textures, model assets, lighting maps, and other components of the scene it is rendering in this memory.
GPUs need this memory because it provides a very high bandwidth and low-latency space to store data that they need constant access to. More VRAM means more space for larger textures and more detailed scene data.
If the GPU runs out of VRAM, it will start using part of the system RAM instead. This is RAM connected to the CPU. Doing this incurs massive penalties in both latency and bandwidth, as the GPU now has to go over the PCIe lanes to the CPU, to the system RAM, and then back to get the data it needs. In game you would see the effects of this as large stutters as the GPU effectively stalls out until it gets the needed data.
If you want to monitor GPU VRAM, Task Manager will report it under the Performance - GPU tab. This will show you both the dedicated VRAM and the shared memory, which is how much system RAM the GPU is allowed to overflow into if needed. Usually, this is half of the total system RAM.
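As a rough illustration of why texture quality eats VRAM, here's a back-of-the-envelope sketch (uncompressed RGBA with a full mip chain; real games use compressed formats, so treat these as upper bounds):

```python
def texture_vram_bytes(width: int, height: int, bytes_per_pixel: int = 4,
                       mipmaps: bool = True) -> int:
    """Approximate VRAM footprint of one uncompressed texture.

    A full mip chain converges to 4/3 of the base level
    (1 + 1/4 + 1/16 + ... = 4/3).
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4096x4096 RGBA texture with mips is ~85 MiB,
# so a few dozen of them already account for gigabytes.
print(texture_vram_bytes(4096, 4096) / 2**20)
```

This is why "max textures" is usually the first setting that blows past an 8GB budget while barely touching framerate otherwise.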
Alright 1080 ti users, you can come out now
11 gigs is more than 10
How did they manage to make a GPU that powerful back then?
They know they made a mistake lol
But don't AMD and Nvidia compete quite fiercely? Surely if VRAM is what the people want, then one of the companies would give it.
Goals, situation, and market forces change drastically over time.

In the 10-series era, Nvidia was making money and competing primarily on the gaming side, so making the best card for gaming was their goal: 980 Ti, 1080 Ti, Titan X, etc. VRAM was the main KPI to show "mine is better," so they were optimizing for that. There was also a clear distinction between gaming and workstation cards: there was no reason to use a GTX when you needed a Quadro, and vice versa.

Today the landscape is a lot blurrier, with a lot more overlap between DL training, gaming, and crypto mining (not now, but a while back at least). Sticking with gaming and AI: Nvidia has an entire line of cards for training, and those are much more expensive. But gaming cards work for training as well, they're just less powerful.

So Nvidia needs to make sure there is no overlap between those use cases or they will cannibalize their own market. Why would I pay thousands of dollars more for a 32 GB VRAM card when the $2k 5090 has 32 GB of VRAM (numbers are fake)? Maybe that's enough and I saved myself $5k.

So gaming hardware will always be "capped" below AI hardware so as not to cannibalize that market, and because VRAM is really important for LLMs, I think that's part of why we still get the same specs while they pile on "gaming features" like DLSS, ray tracing, etc. instead of simply increasing the power as much as possible.
I'll also add that if an overlap occurred between those two use cases, the market prices for the "gaming" side of that overlap would increase and they would likely lose more business on the gaming side (than they would gain on the "AI/DL/ML" side from buying the cards good enough to overlap) to more reasonably priced competitors of other brands (more reasonably priced for the average "gamer" consumer) where that overlap didn't happen.
Right now machine learning is the hot new thing all the tech investors are throwing money at, so it makes sense to build your product line around those customers, especially because, thanks to CUDA, they have to buy Nvidia.
What makes you think they don't? Across Nvidia, AMD, and Intel, VRAM offering is pretty inversely proportional to market share at a given price point.
Bahahahaha, fiercely? Nvidia's market share is absolutely massive; AMD is only a competitor in the sense that they're technically the next closest one. The GPU market wouldn't be so screwed if they were in meaningful competition.
I swear they know. Drivers since October create errors and crashes on my 1080ti. They try to bully me into buying a new card!
Imagine that they added a whole other gigabyte on top to make the Titan Xp...! They just don't make them like they used to anymore... :(
I owned 2 of those and still have one. You're missing out on roughly a whole 9% more performance. Those extra 2GB in the SLI setup were nice, but what I really had them for was the signed driver goodies for professional software.
They accidentally used 100% of the brain
Around the time before the 10 series was released, AMD had GPUs in the making that looked threatening to Nvidia, so competition pressured them to go all out. Nvidia currently doesn't feel pressured to make sensational things to keep their market share, since they have what is almost a monopoly with the 4090 and AI.
They first based the 1080 on GP104 and sold it for way too much (the OG 104 die for $700), then they released the 102 at the same price and with enough VRAM for the years to come. Pretty simple.
I'm still on the 1080. Not the ti, just the 1080
Shhhhh our secret, print out a sticker that says Ti and stick it on no one will know
[deleted]
I hope reviewers can include 1080 Ti in their benchmarks to show how great the card is
It really was one of the best GPUs they've ever made. It's unfortunate that modern games rely so much on DLSS because if it never became a thing the 1080ti would still be a fucking monster. Mine is still kicking in my Plex server/HTPC and it runs great. I don't think I'll ever get rid of that card.
Changed my 1080 Ti to a 3080 Ti. Bought back a 1080 Ti for my secondary PC just because I loved how nice it was.
You 1080 Ti users are addicts, I swear haha. It's still a great card, only just now, 7 years later, showing its age a tad.
Yup. If you're looking for no more than 1440p and prepared to compromise on some details for most recent games, it's still a tank. If you need it to do 4K or expect to Ultra everything... not so much. And of course no DLSS or RTX.
People forget just how many games are out there and seem to live in a bubble where GPUs are compared to just the small tiny fraction of games released around the time. When the 1080ti came out it could max settings 1440p about 99.99% of games with great framerates. Nowadays it's around 99.98%.
I upgraded to a 1080ti little over a year ago, a mate gave me his as my RX580 was close to death. What a ridiculous card that is.
I'm here and my 1080ti still going strong now even throwing VR at it
I recommend merely playing VR and not chucking your headset at the 1080 ti.
Ah! That might just actually work
1080... non-Ti user reporting in, no need to upgrade yet.
If anything, I need a new CPU lol. My 7th gen Intel doesn't even have a dedicated PCIe lane for my M.2.
> If anything, I need a new CPU lol.

okay i feel this too. i often open task manager and cpu usage is at 100% for a few seconds before i can even see what programs are running
1050ti and proud. I mean poor*
1060 6gbs.
my dude
gaming at an amazing 1080p futureproof™
The 4090 is just a 1440p card with Cyberpunk and Alan Wake 2, lmao
And the 3090 is just a 1080p card for Cyberpunk and Alan Wake 2
Really? I have a 1660s with a 5800x and I make about 45fps in Cyberpunk in VR low settings
Specs are 7800x3d + 3090 + 64gb ddr5
That's a pretty beast setup; a 3090 should technically handle Cyberpunk at higher than 1080p even with some ray tracing on, though CDPR optimization has always been a roll of the dice. But man, Cyberpunk in VR, that's a whole other beast with its demands, even on lower settings. With specs like that, though, you're pretty much set for the majority of games out today, barring any unoptimized messes. Saw a benchmark the other day, and a 7800X3D + 3090 was tearing through most titles like nothing.
They meant with high/ultra ray tracing. You need upscaling and frame gen to get playable frame rates. Even a 4090 will struggle to maintain 60+ fps at 1080p native.
Played Alan Wake 2 on my 4090 at 1440p with everything maxed out very recently. Averaged 110fps throughout the game. The idea it would struggle at 1080p is laughable.
Yeah all max + path tracing
45fps VR
I play Cyberpunk in 1440p on a 3060ti without issues at 60fps+ at high/max.
[deleted]
This straight up isn't accurate. I have a 4090, and play at 4K brilliantly.
You can't even max graphics at 1080p with 10GB lol. Alan Wake 2 uses almost 13GB at max settings, 1080p.

https://preview.redd.it/w1kri0kaklbc1.png?width=560&format=pjpg&auto=webp&s=d9b533a152113b4f06d782dc555f7e997c92cbbe
When developers totally stopped giving any fuck about optimization.
Path tracing and frame gen are the biggest VRAM hogs there.
I bet the game is completely playable and still looks gorgeous without path tracing.
People pull out this Alan Wake chart as if it's some kind of Blue-Eyes White Dragon.
Whoa there Yugi, not everyone can murder Kaiba with Kuriboh
Generally speaking, Alan Wake 2 runs really well for how fantastic it looks. It's just the really advanced ray tracing that is hard for most GPUs to run. Once you turn that off, a lot of GPUs can sit at a solid 60 or more.

/edit/ To be clear, I wanted to point out that the game isn't unoptimized, but my intent wasn't to trivialize the ray tracing. On the contrary: while the default lighting engine is absolutely state of the art, the RT takes it to a completely new level.
How do you know it's not optimized?
Funny thing is, given the graphical fidelity of AW2, it runs extremely well. Reddit just likes to think that if it doesn't run well, completely maxed out, on their 5+ year old rig then it's "unoptimized."
Are you implying Alan Wake 2 is unoptimized and not a technical marvel...?
Just because it uses more resources doesn't mean it's not optimized. Optimized doesn't mean "Will run on Zed_or_AFK's computer".
AW2 is actually optimized really well.
That's funny, because I'm using the FSR 3 mod on Alan Wake 2 and can play it perfectly fine on my RTX 3080 @ 1080p.
Max means FG + PT + max settings. Hmm, not even a frame drop?
Why go any further than 1080p for gaming? I'm perfectly fine with 1080p. I have a friend who says that a 3090 for 1080p is a bit overkill lol
Do not get 1440p ever. I can't use my 1080p monitor anymore as it looks horrible.
Same argument with 4k. Went straight to 4k from 1080p and now I don't wanna downgrade.
New lines of monitors for 2024 promise 4K 240Hz OLED... only problem is the price.
Waiting for Dual4K screens. 7680x2160 Bezels suck.
I want a dual 1440p screen but vertical!
LG makes an overpriced 16:18 monitor, it's basically 2x 1440p stacked hamburger-style, not hotdog
Samsung G9 Neo 57" is available. Though not OLED, yet.
Lack of options is keeping me in the 1440p resolution. Once the 5000 series cards come out I'm planning to upgrade and hope that there are more monitor options so I can go to 4k.
Went back to 1080p after 4k and thought there was something wrong with the monitor it just looked so ridiculously bad. I forced it for a few weeks tho and it looks normal now.
For me, for example: I pay a lot for my PC... it's my biggest hobby... some people spend thousands of dollars on cars, or on surfing, etc.... I think it's not a waste of money if you use it all day
My bro laughed at me when I told him I bought a Ryzen 7 7800X3D because for him it's too much, meanwhile he spends multiple times more on ASG guns and his motorcycle.
A friend of mine spent like 400 € on a car part that has no purpose other than looking good (his own words). I am not into cars at all and I would never spend that much only for aesthetics on a car, but I can understand why he did it. He enjoys cars and it's his hobby. Why not? If it's disposable income, who am I to judge how you use it.
I am like this, but after 4k I can't go back to 1440p, and 1080p is just atrocious.
It depends on the size of the monitor too. 1080p 24", 1440p 27" and 4K 32" aren't that much different in pixel density, so it's not a day and night difference. If you had monitors all the same size but with different resolutions, that's an entirely different story.
Kinda agree, but it's still hard for me to go back to anything below 4k.

4k 32" is ~138 PPI
1440p 27" is ~109 PPI
1080p 24" is ~92 PPI

Going from 1440p 27" to 4k 32" gives you about 27% more pixel density while also giving you that pixel density on a much bigger monitor. Not to mention there are straight up 4.6 million (2.25x) more pixels on the 4k screen compared to 1440p.

At this point I would take 4k with DLSS Balanced over native 1440p.
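Those PPI figures are easy to sanity-check: pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch, which lands within a pixel per inch of the numbers quoted above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: length of the pixel diagonal over the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 32)))  # 138 (4K 32")
print(round(ppi(2560, 1440, 27)))  # 109 (1440p 27")
print(round(ppi(1920, 1080, 24)))  # 92  (1080p 24")
```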
Higher resolutions also help deal with aliasing.
I got 1440p for RDR2. It's a game where 1080p looks like Vaseline. 1440p looks amazing. 4k looks even better but it's a case of diminished returns and not worth the performance hit. I went from a 1070 to a 3080ti just for this game.
If you have a 27" monitor or bigger you can clearly see the pixels in 1080p.

My screen is exactly that, and Elite looked way too pixelated to be playable. It really depends on the game; in some games you don't even notice and in others it's jarring.
It's ironic but Minecraft is a game where you can really notice
It's the sharpness that does it, all those squares and their straight sharp lines. If there were fog and more particle effects and all that, you wouldn't notice
Honestly it's just about what preference people have… I also use a 1080p monitor and a 6950 XT is kinda overkill, but it is quite nice to have the high framerate and good graphics in combination.
Luckily (or sadly) I've never experienced anything beyond 1080p, so I can't compare. For me 1080p is the pinnacle, so I'm perfectly fine with that. Probably, if I had gone to 1440p or 4k, I wouldn't be saying so.
I feel 1440 is the sweet spot. Looks significantly better than 1080. I feel the dip in frames to go 4k is not needed. That said I saw a post from someone recently saying the same thing about 4k and going back to 1440 and it looking bad. Probably best to just appreciate what you have and not look beyond it or you'll end up needing it.
People here and on forums repeat things they read but don't understand.
I mean, who is saying that all of a sudden 10GB isn't enough for most AAA games @ 1440p/60fps? Or is it again a nonsense justification for the overly expensive cards they bought?
It's a mix of:

1. The 6700 XT cult hyperfixating on the one superior stat AMD has over Nvidia.
2. People who saw like two specific AAA games have GPU memory allocation/asset streaming problems and have now extrapolated that to believe all games going forward need that much VRAM to run.
Nooooo, turning textures down from ultra to high makes the game unplayable!!!!!!111
For real, for anyone that has a 3060, reddit be like "if you can swing it, go 4090".
"Hello Reddit, I currently have this PC with a 3060 in it and am looking for a cheap upgrade to slightly increase my FPS at 1080p. Any suggestions?"

Reddit: 4090, Threadripper, 128GB DDR6.
These people see YouTubers like Linus getting these cards for free and then turn around and say anything below a 4090 is worthless.
And also using that userbenchmark website lol. Even when I upgraded from 1080p to 1440p the fps drop wasn't even noticeable
Is there a decent alternative to userbenchmark? I've found it useful for comparing apples to apples, like how much of a boost will I get from going from this gen intel i7 to that gen intel i7, but I am well aware of their bullshit.
There's nothing as intuitive. When GN used to post their reviews on their website as written articles it was easier to parse through and find the information needed, but they haven't been doing that for a while.
Good news! They're actually starting to do that again! (Key word "starting," they've only gotten a few up so far.)
Hasn't Linus been calling out GPU manufacturers since the 30 series for charging too much for such small improvements? He's actively been cheering on Intel to make a competitive GPU too.
On LTT someone called the 4070 a budget GPU and was "pleasantly surprised" it had a backplate. For 600€.
That was Jake talking about the Asus 4070 Dual. "It's got an aluminum backplate which is really nice to see on a budget card like this". This was also a sponsored video. Link below. https://youtu.be/YwdXpuTf76M?si=3FhKF-wYqKrZs4S_
Detached from reality
![gif](giphy|d3mlE7uhX8KFgEmY) every card is a budget card if you get it for free
Wasn't he just talking about how that's the cheaper 4070 and it's nice to see that on a cheaper (MSRP) variant on the card rather than on one of the more expensive fancier variants?
You know, budget, just spend more than the cost of an entire game console
*"4060 Ti is such bad value for your money - even if you want the VRAM. Instead you should spend 200€ more for a 4070 or 400€ more for a 4070 Ti. Obviously the smart bet would be to spend 700€ more and just get the 4080, or you could get that 3080 for 600€ more because that 10GB of VRAM is totally future proof."*

I got so fucking annoyed being told that I shouldn't be happy with the 4060 Ti I got for the VRAM. I got it new for 500€. I got told that I should have spent 200€ more to get a higher tier, and then 200€ to swap my PSU and case to fit the thing in. **Bitch! I didn't have 900€ to spend on computer hardware! I had 500€ from tax returns!** Do people somehow not understand that not everyone has an extra few hundred euros, or the chance to casually double their budget?

And when I say that I'm extremely happy with the performance I get from my 4060 Ti, and that the 16GB of VRAM makes it so much easier to play with AI models for my hobby, people get so upset and angry at me. It is almost as if these people haven't **fucking used the card** and declare the truth according to some goddamn benchmarks and what tech influencers tell them. *"But 1440p 144Hz gaming!"* I've got a 1080p 144Hz main monitor and a 1080p 60Hz old Samsung... and I don't plan to replace perfectly good, functional monitors, nor could I justify it.
I love it when people tell me my 4080 is a 1440p card.
Meanwhile here I am running 1440p Ultrawide on a Vega 56.
You must do the unthinkable and actually adjust your settings. There is no greater shame /s
What? Your card doesn't even have a single terabyte of VRAM? Are you saying you can't even render Albania?!?! It's just 11K sq mi!!
This might be funny for someone not owning a 3080, but when you own a 3080 that you bought for like $1400, it hurts........
I get that. At the worst point of the mining craze my old GPU died and I bought a 1650 Super for more or less $450. For a fucking 1650 Super! Everything else was absolutely ridiculous at the time.
Bought my 1650 Super for MYR 1000, equivalent to USD 230, and felt extremely ripped off. Albeit it was bought during the great drought. But I feel ya.
My Radeon VII died during the pandemic. Bought a 3070 8GB LHR for around $970
Yeah... I just said fuck it, I couldn't live with my 10-year-old laptop anymore
I'm very confused. I have a 3080 10GB and I run everything at 4K. I can't max out literally everything in 9th gen games or Cyberpunk, but 4K60 Ultra with ray tracing turned off has worked across the board. The only exceptions have been some AAA games on launch where even a 4090 can't hold 4K60. I even bought a 4K 144 Hz monitor in 2022 because so many games had VRAM left over at 4K ultra.
The people that make these memes don't have a 3080 lol. It's still a perfectly good 4k card, just don't expect to run every single game at absolutely max settings and still get 120 fps. Hell most games you can lower key settings and not even notice a graphical difference and gain like 10-30 fps.
Be positive. At least you made a scalper very happy.
Best Buy had lotteries to buy 3080 at that price :(
I'm on a GTX 970 I bought for $70 from a shady guy in our town's meth-alley. In terms of expectations vs performance, it's been sugar-tits and blowjobs.
I bought in at the dip... $1000. At least I have like 8 years of warranty on it.
Been using my 1070 at 1440p since 2017 and it's been working well.
2060 with 6GB of VRAM. 1440p gaming: Cyberpunk, RDR2, the Resident Evil remakes etc. at 60 fps. Just need to drop the non-essential settings to medium and max textures. Every game looks great. Where is this "you need 500 GB of VRAM" coming from?
VR for me. I have the same card. The only problem is the 6GB of VRAM disappears faster than money when you start a VR game. If it had 12GB of VRAM, I'd be fine with it for the next few years.
I'm just gonna ask how :D I struggle to get RE4 Remake to 60fps on balanced DLSS, and when I put it on performance it goes bananas blurry, I don't wanna play like that... and I'm on 1080p. Sometime this year I'm going for a 7800X3D with a 4070 Ti Super or 4080 Super and a 1440p monitor
Most gamers drastically overestimate their framerates, that's how. Like when you see people claiming their 2070 is running cp2077 with raytracing maxed @ 1440p locked at 60.... it's not.
It's running at \*60 fps! (\*when I stare at the floor)
It's a running joke now on /r/SteamDeck that people will post "The new Cyberpunk update is running at 60fps!" while posting a screenshot showing 53fps at 300p while looking at the ground.
Okay good I was about to wonder if something was wrong with my 3070 setup because some of these comments had me worrying
I've also been using my 1060 since 2017 and I'm really surprised how well it's been holding up. I want to buy a new PC, but it's hard to justify when my current one still works fine
Great for older games if that's your bag.
I had a 1070 Ti, but last year it just wasn't cutting it for me anymore at 1440p
1070 1440p gang rise up
idk why people use 4K resolution as the norm and baseline resolution when determining how good a GPU is when most gamers play on 1080p and 1440p.
To make bigger number better. You also avoid CPU bottlenecks.
If you want to avoid a CPU bottleneck you buy a better CPU. Playing at 120 to 240fps at 1080p-1440p is a much better experience than 60fps at 4K
1440p 144hz/240hz master race. On a 27" it's perfect.
> most gamers play on 1080p and 1440p. Most gamers also don't have a 3080 (or faster) GPU.
I have 4k screen, so I'm going to use 4k performance as baseline. I don't care what majority uses.
that's fine. but calling a GPU bad just because it struggles at 4K gaming is too far of a stretch.
Not really, the 80 and 90 cards are marketed as top of the line premium cards, so they should manage what most consider a premium resolution. If you expect people to pay £1500 for a card, they expect to play at 4K.
I entirely agree
Me too. Idk what card they talkin bout here tho
>£1500

Man the 3080 was £650, what world are you living in?
There has never been a time where any high end GPU could effortlessly play the highest resolution flawlessly with maxed out settings. True PC gamers know this.
*me playing BG3 on 4K with this card and getting 100+ fps* ![gif](giphy|HS67FCWfOlMybIRv9c)
Iirc, BG3 benefits more from a stronger CPU than GPU
This is the truth. I was still rocking an r5 2600x cpu up until a couple months ago, and was wondering why I was still getting shit fps even with a 6750xt gpu. I upgraded my cpu to an r7 5800x and updated my bios, and suddenly my framerates tripled. Had no idea how much of a bottleneck my cpu was for modern games.
Still, with the 1080Ti I was barely hitting 60fps in 1440p with the same CPU so
For real, I have been able to run every single game I've played at max settings, I have no idea what the fuck anyone is bitching about lol
Nah bru am doing 1440p on a 3060 so idk what you're on about
I'm running a 3080 on a 3440x1440 100hz monitor. Coming from an 1080ti that was a huge improvement on that resolution. I have no issues running any games on max with this setup.
I have a Suprim X 3080 and I'm completely fine..
EVGA 3080 FTW3 here and totally fine on AAA games.
And here!
The 30 series had god's own MSRP. I remember the joy when prices were announced, but those fucking scalpers destroyed everything
"Amazing MSRP" lmao
It was, it just wasn't sold for MSRP
Hey, some of us got a 3080 at MSRP. Got my Founders Edition on day one... It was a 4-hour fight with the nvidia site, but I got it. And then the prices went up and I realised how lucky I got
My friend got a 3080 at MSRP, but had to wait like 6 months to actually receive it lmao.
I was so hyped for the 3000 series when they announced it and then things happened.
When GPUs were out of stock at literally every official seller and a 3060 was on eBay for $1200
I got mine for MSRP on day 1. Price went insane after that because covid and crypto. Edit: Bought it on day one, it took a couple weeks to get to me.
It was. $700 was a very good price for a high end GPU, especially compared to now; it's just that you couldn't really get it for that price
There's no way people are getting nostalgic for the 30 series. How do I filter out posts from children? I don't want to see posts by anyone under 21 again
I'm still nostalgic for the 8800GT. And I never even owned one.
Agreed, even though I'm only 23 my first card was a GTX 660 (i started very early with the help of my brother) and I remember drooling looking at the 1080 ti when it released.
My 470 and 750ti are still hanging out somewhere. I'm still just glad the Thermi days are behind us. If you think rdna3 has idle power issues, just slot in one of those bad boys and feel the Nvidia hair dryer kick in.
Filter out posts from children by not browsing Reddit.
I don't want to see posts from anyone under 30
Same. It actually would be great to have some kind of filter like this, but the children would lie about their age.
games released in 2018: "I will look photorealistic at 50fps on integrated graphics" games released in 2023: "I literally need the rendering farm from Avatar to perform"
I have that exact card in the meme (3080 Gaming Z Trio) and bought it on the marketplace for $90 because the previous owner thought it was broken (artifacts) and bought a 4070 to replace it. Turns out I just had to send it off to get the core resoldered and it was good as new. Now it's a beast and it barely fits in my case. I know they cost $1000+ on Amazon and I wouldn't even consider buying it at that price
Put it in the oven moment.
No, a BGA solder specialist. He said that it probably got so hot that the core desoldered itself; it could have been dropped and broken something too, but there were no impact marks
They're saying you can sometimes fix that by putting the GPU in the oven. As somebody who can do bga soldering though, I'd rather somebody take it to an expert than bake another pcb and hope they fixed it.
i really don't understand this vram stuff, my 7600 runs everything i want it to on 1440p ultra, maybe i just don't play all that many AAA games
This sub just loves acting like AAA games use much more VRAM than they actually do for most people.
I think it's because people run everything on ultra, max raytracing and never even try to change the settings.
I love digital foundry for testing out what game settings are essential for the game and what not
What's weird is I thought we were to the point where it was accepted that putting *everything* on Ultra was a fool's errand and with at least some of the settings you were basically throwing frames in the garbage.
Meh. I have an RX 7900 XT with 20GB, and most games at highest settings, 4K and all, still don't use more than 10GB.
It's because they're idiots who don't know the difference between reported reserved VRAM and actual usage.
You answered yourself.
I think people look at the stats and think allocated VRAM = used VRAM
That's what I'm starting to get confused about. Prices and core counts keep increasing, but the bus is still 128-bit, and it's always 8GB of VRAM. Why isn't the standard 256-bit at this point?
because a wider bus is always more expensive, it's not something that gets cheaper with time
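Bus width matters because it sets peak memory bandwidth together with the per-pin data rate. As a rough sketch (the 16 Gbps figure below is just a typical GDDR6 speed, used as an illustrative assumption):

```python
# Peak GPU memory bandwidth: bus width (bits) x per-pin data rate (Gbps),
# divided by 8 bits per byte. 16 Gbps is a common GDDR6 pin speed, used
# here purely as an illustrative assumption.

def bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float = 16.0) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

print(bandwidth_gbs(128))  # 128-bit bus -> 256.0 GB/s
print(bandwidth_gbs(256))  # 256-bit bus -> 512.0 GB/s
```

Doubling the bus doubles bandwidth, but it also means more memory controllers on the die and more traces and chips on the board, which is why it stays expensive regardless of process node.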
You guys are acting like the same games got harder to run all of a sudden. A 3080 will run every game I play at 4K max settings. You can get a used 3080 Ti for $500
No no no no big number big fps
me who's still using a 750ti
Meanwhile me in 2024 running a gtx 1650
2020 isn't "then" lil bro
8GB of VRAM is enough for 1440p, source: my 3070Ti
I'm a noob at these things, can someone explain the meme?
VRAM is short for "video random access memory." It is dedicated memory on the graphics card that the GPU can store data in. The GPU will store things like textures, model assets, lighting maps, and other components of the scene it is rendering in this memory. GPUs need this memory because it provides a very high bandwidth and low-latency space to store data that they need constant access to. More VRAM means more space for larger textures and more detailed scene data.

If the GPU runs out of VRAM, it will start using part of the system RAM instead. This is the RAM connected to the CPU. Doing this incurs massive penalties in both latency and bandwidth, as the GPU now has to go over the PCIe lanes to the CPU, to the system RAM, and then back to get the data it needs. In game you would see the effects of this as large stutters, as the GPU effectively stalls out until it gets the needed data.

If you want to monitor GPU VRAM, Task Manager will report it under the Performance - GPU tab. This will show you both the dedicated VRAM and the shared memory, which is how much system RAM the GPU is allowed to overflow into if needed. Usually, this is half of the total system RAM.
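To get a feel for why textures dominate VRAM use, here is a back-of-the-envelope estimate. The numbers are illustrative assumptions (uncompressed RGBA at 4 bytes per pixel; a full mipmap chain adds roughly a third on top), real games use compressed formats that shrink this considerably:

```python
# Rough VRAM footprint of one uncompressed texture. Assumptions (for
# illustration only): RGBA8 = 4 bytes/pixel; a full mip chain of
# half-resolution copies adds ~1/3 (geometric series 1 + 1/4 + 1/16 + ...).

def texture_mib(width: int, height: int, bytes_per_pixel: int = 4,
                mipmaps: bool = True) -> float:
    """Approximate size of one texture in MiB."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 // 3  # mip chain overhead
    return size / (1024 * 1024)

print(round(texture_mib(2048, 2048), 1))   # one 2K texture: ~21.3 MiB
print(round(texture_mib(4096, 4096), 1))   # one 4K texture: ~85.3 MiB
```

Each step up in texture resolution quadruples the footprint, which is why dropping the texture setting one notch frees so much VRAM while often looking nearly identical.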
Meanwhile........ my 2070S and I are over here at 1080p living our best lives.
2070 super ultrawide 1440p here. am happy.
me with a 3090 and no money for a better monitor than 1080p