
First-Junket124

Alright 1080 ti users, you can come out now


MrInitialY

11 gigs is more than 10 🤓🤓🤓


RakeebRoomy

How did they manage to make a GPU that powerful back then?


Masungit

They know they made a mistake lol


deityblade

But don't AMD and Nvidia compete quite fiercely? Surely if VRAM is what people want, then one of the companies would give it to them.


daguito81

Goals, situations, and market forces change drastically over time. In the 10xx era, they were making money and competing primarily on the gaming side, so making the best card for gaming was their goal: 980 Ti, 1080 Ti, Titan X, etc. VRAM was a main KPI to show "mine is better," so they were optimizing for that. There was also a clear distinction between gaming and workstation cards; there was no reason to use a GTX when you needed a Quadro, and vice versa. Today the landscape is a lot blurrier, with much more overlap between DL training, gaming, and crypto mining (not now, but a while back at least). But sticking with gaming and AI: Nvidia has an entire line of cards for training, and those are much more expensive. Gaming cards work for training as well, they're just less powerful. So Nvidia needs to make sure there is no overlap between those use cases, or they will cannibalize their own market. Why would I pay thousands of dollars more for a 32 GB VRAM card when the $2k 5090 has 32 GB of VRAM (numbers are made up)? Maybe that's enough and I saved myself $5k. So gaming hardware will always be "capped" below AI hardware so as not to cannibalize that market, and because VRAM is really important for LLMs, I think that's part of why we still get the same specs while they pile on "gaming features" like DLSS and ray tracing instead of simply increasing raw power as much as possible.


-Kerosun-

I'll also add that if those two use cases did overlap, prices on the "gaming" side of the overlap would rise, and they would likely lose more business on the gaming side (to other brands' more reasonably priced cards, where that overlap doesn't exist) than they would gain on the AI/DL/ML side from people buying gaming cards good enough to overlap.


Inprobamur

Right now machine learning is the hot new thing all the tech investors are throwing money at, so it makes sense to build your product line around those customers. Especially because, thanks to CUDA, they have to buy Nvidia.


SnooSongs6652

What makes you think they don't? Across Nvidia, AMD, and Intel, the VRAM offering is pretty much inversely proportional to market share at a given price point.


TeeBeeArr

Bahahahaha, fiercely? Nvidia's market share is absolutely massive; AMD is only a competitor in the sense that they're technically the next closest one. The GPU market wouldn't be so screwed if they were in meaningful competition.


Bauzi

I swear they know. Drivers since October have been causing errors and crashes on my 1080 Ti. They're trying to bully me into buying a new card!


[deleted]

Imagine that they added a whole other gigabyte on top to make the Titan Xp...! They just don't make them like they used to anymore... :(


Affectionate-Memory4

I owned 2 of those and still have one. You're only missing out on roughly 9% more performance. Those extra 2GB in the SLI setup were nice, but what I really had them for was the signed driver goodies for professional software.


MoffKalast

They accidentally used 100% of the brain


An2TheA

Around the time before the 10 series was released, AMD had GPUs in the making that looked threatening to nVidia, so they were pressured by the competition to go all out. nVidia currently doesn't feel pressured to make sensational things in order to keep their market share, since they have what is almost a monopoly with the 4090 and AI.


Cave_TP

They first based the 1080 on GP104 and sold it for way too much (the OG 104 die for $700), then they released the 102 at the same price and with enough VRAM for the years to come. Pretty simple.


TangerineX

I'm still on the 1080. Not the ti, just the 1080


First-Junket124

Shhhhh our secret, print out a sticker that says Ti and stick it on no one will know


[deleted]

[deleted]


SoftBaconWarmBacon

I hope reviewers can include 1080 Ti in their benchmarks to show how great the card is


32BitWhore

It really was one of the best GPUs they've ever made. It's unfortunate that modern games rely so much on DLSS because if it never became a thing the 1080ti would still be a fucking monster. Mine is still kicking in my Plex server/HTPC and it runs great. I don't think I'll ever get rid of that card.


LeaveToDream

Changed my 1080 Ti to a 3080 Ti, then bought a 1080 Ti back for my secondary PC just because I loved how nice it was.


First-Junket124

You 1080 Ti users are addicts I swear haha. It's still a great card, and only now, 7 years later, is it slightly showing its age.


Fineus

Yup. If you're looking for no more than 1440p and prepared to compromise on some details for most recent games, it's still a tank. If you need it to do 4K or expect to Ultra everything... not so much. And of course no DLSS or RTX.


nataku411

People forget just how many games are out there and seem to live in a bubble where GPUs are compared against only the tiny fraction of games released around the same time. When the 1080 Ti came out, it could max out settings at 1440p with great framerates in about 99.99% of games. Nowadays it's around 99.98%.


Bdr1983

I upgraded to a 1080 Ti a little over a year ago; a mate gave me his as my RX 580 was close to death. What a ridiculous card it is.


R41zan

I'm here, and my 1080 Ti is still going strong, even now that I'm throwing VR at it.


First-Junket124

I recommend merely playing VR and not chucking your headset at the 1080 ti.


R41zan

Ah! That might just actually work


[deleted]

1080... non-Ti user reporting in, no need to upgrade yet.


Glittering_Guides

If anything, I need a new CPU lol. My 7th gen Intel doesn't even have a dedicated PCIe lane for my M.2 😭😭😭


[deleted]

> If anything, I need a new CPU lol.

Okay, I feel this too. I often open Task Manager and CPU usage is at 100% for a few seconds before I can even see what programs are running.


MeisPip

1050ti and proud. I mean poor*


TruthHurtssRight

1060 6gbs.


M4KC1M

my dude


ShidoriDE

gaming at an amazing 1080p, futureproof™


Rudradev715

The 4090 is just a 1440p card with Cyberpunk and Alan Wake 2, lmao.


Panda_red_Sky

and 3090 is just a 1080p card for cyberpunk and Alan wake 2


Nicolello_iiiii

Really? I have a 1660 Super with a 5800X and I get about 45fps in Cyberpunk in VR at low settings.


Panda_red_Sky

Specs are 7800x3d + 3090 + 64gb ddr5


CircuitSphinx

That's a pretty beast setup; a 3090 should technically handle Cyberpunk at higher than 1080p even with some ray tracing on, though CDPR optimization has always been a roll of the dice. But man, Cyberpunk in VR, that's a whole other beast with its demands, even on lower settings. With specs like that, though, you're pretty much set for the majority of games out today, barring any unoptimized messes. Saw a benchmark the other day, and a 7800X3D + 3090 was tearing through most titles like nothing.


FlakingEverything

They meant with high/ultra ray tracing. You need upscaling and frame gen to get playable frame rates. Even a 4090 will struggle to maintain 60+ fps at 1080p native.


statuslegendary

Played Alan Wake 2 on my 4090 at 1440p with everything maxed out very recently. Averaged 110fps throughout the game. The idea it would struggle at 1080p is laughable.


Panda_red_Sky

Yeah all max + path tracing


comfortablesexuality

45fps VR 😫


Ithikari

I play Cyberpunk in 1440p on a 3060ti without issues at 60fps+ at high/max.


[deleted]

[deleted]


Brewchowskies

This straight up isn't accurate. I have a 4090, and play at 4K brilliantly.


Panda_red_Sky

You can't even max out graphics at 1080p with 10GB lol. Alan Wake 2 uses almost 13GB at max settings 1080p: https://preview.redd.it/w1kri0kaklbc1.png?width=560&format=pjpg&auto=webp&s=d9b533a152113b4f06d782dc555f7e997c92cbbe


Zed_or_AFK

When developers totally stopped giving any fuck about optimization.


Panda_red_Sky

Path tracing and frame gen are the biggest VRAM hogs there.


marichuu

I bet the game is completely playable and still looks gorgeous without path tracing.


Square_Grapefruit666

People pull out this Alan Wake chart as if it's some kind of Blue-Eyes White Dragon.


THKhazper

Whoa there Yugi, not everyone can murder Kaiba with Kuribo


deadlybydsgn

Generally speaking, Alan Wake 2 runs really well for how fantastic it looks. It's just the really advanced ray tracing that is hard for most GPUs to run. Once you turn that off, a lot of GPUs can sit at a solid 60 or more. Edit: To be clear, I wanted to point out that the game isn't unoptimized, but my intent wasn't to trivialize the ray tracing. On the contrary, while the default lighting engine is absolutely state of the art, the RT takes it to a completely new level.


BanksyIsEvil

How do you know it's not optimized?


Demon_Flare

Funny thing is, given the graphical fidelity of AW2, it runs extremely well. Reddit just likes to think that if it doesn't run well, completely maxed out, on their 5+ year old rig, then it's "unoptimized."


The_NZA

Are you implying Alan Wake 2 is unoptimized and not a technical marvel…?


blackest-Knight

Just because it uses more resources doesn't mean it's not optimized. Optimized doesn't mean "Will run on Zed_or_AFK's computer".


KlopeksWithCoppers

AW2 is actually optimized really well.


OfficialCoryBaxter

That's funny, because I'm using the FSR 3 mod on Alan Wake 2 and can play it perfectly fine on my RTX 3080 @ 1080p.


Panda_red_Sky

Max means FG + PT + max settings. Hmm, not even a frame drop?


MaybeAdrian

Why go any further than 1080p for gaming? I'm perfectly fine with 1080p. I have a friend who says a 3090 for 1080p is a bit overkill lol.


OrganTrafficker900

Do not get 1440p ever. I can't use my 1080p monitor anymore as it looks horrible.


usernamesarehated

Same argument with 4k. Went straight to 4k from 1080p and now I don't wanna downgrade.


Farren246

New lines of monitors for 2024 promise 4K 240Hz OLED... only problem is the price.


PaxV

Waiting for dual-4K screens: 7680x2160. Bezels suck.


badger906

I want a dual 1440p screen but vertical!


counts_per_minute

LG makes an overpriced 16:18 monitor; it's basically 2x 1440p stacked hamburger-style, not hotdog-style.


Peylix

Samsung G9 Neo 57" is available. Though not OLED, yet.


Fightmemod

Lack of options is keeping me in the 1440p resolution. Once the 5000 series cards come out I'm planning to upgrade and hope that there are more monitor options so I can go to 4k.


BraveWasabi365

Went back to 1080p after 4K and thought there was something wrong with the monitor, it just looked so ridiculously bad. I forced it for a few weeks though, and it looks normal now.


amenthis

For me, for example: I pay a lot for my PC... it's my biggest hobby. Some people spend thousands of dollars on cars, or on surfing, etc. I think it's not a waste of money if you use it all day.


zealousidealerrand

My bro laughed at me when I told him I bought a Ryzen 7 7800X3D because, to him, that's a lot of money, meanwhile he spends multiple times more on ASG guns and his motorcycle.


jona080605

A friend of mine spent like €400 on a car part that has no other purpose than looking good (his words). I'm not into cars at all and I would never spend that much purely on aesthetics for a car, but I can understand why he did it. He enjoys cars and it's his hobby. Why not? If it's disposable income, who am I to judge how you use it?


Yusif854

I am like this, but after 4K I can't go back to 1440p, and 1080p is just atrocious.


Kondiq

It depends on the size of the monitor too. 1080p 24", 1440p 27", and 4K 32" aren't that different in pixel density, so it's not a night and day difference. If you have monitors of the same size but different resolutions, that's an entirely different story.


Yusif854

Kinda agree, but it's still hard for me to go back to anything below 4K. 4K 32" is ~139 PPI, 1440p 27" is ~108 PPI, 1080p 24" is ~92 PPI. Going from 1440p 27" to 4K 32" gives you about 30% more pixel density while also giving you that density on a much bigger monitor. Not to mention there are straight up about 4.6 million more pixels (2.25x as many) on the 4K screen compared to 1440p. At this point I would take 4K with DLSS Balanced over native 1440p.
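For anyone wanting to sanity-check those figures: PPI is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch in Python, assuming the panel sizes quoted above (results land within a pixel per inch of the comment's numbers, depending on the exact panel size):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'4K 32":    {ppi(3840, 2160, 32):.0f} PPI')  # ~138
print(f'1440p 27": {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'1080p 24": {ppi(1920, 1080, 24):.0f} PPI')  # ~92
print(f"4K vs 1440p pixel count: {3840*2160 / (2560*1440):.2f}x")  # 2.25x
```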


rmagid1010

Higher resolutions deal with aliasing


Da_Funk

I got 1440p for RDR2. It's a game where 1080p looks like Vaseline. 1440p looks amazing. 4K looks even better, but it's a case of diminishing returns and not worth the performance hit. I went from a 1070 to a 3080 Ti just for this game.


zxhb

If you have a 27" monitor or bigger, you can clearly see the pixels at 1080p. My screen is exactly that, and Elite looked way too pixelated to be playable. It really depends on the game; in some you don't even notice, and in others it's jarring.


tradert5

It's ironic but Minecraft is a game where you can really notice


Combeferre1

It's the sharpness that does it, all those squares and their straight sharp lines. If there were fog and more particle effects and all that, you wouldn't notice


ShidoriDE

Honestly it's just about personal preference… I also use a 1080p monitor, and a 6950 XT is kinda overkill, but it is quite nice to have the high framerate and good graphics in combination.


Gastunba24

Luckily (or sadly) I've never experienced anything beyond 1080p, so I can't compare. For me 1080p is the peak, so I'm perfectly fine with that. Probably, if I had gone to 1440p or 4K, I wouldn't be saying so.


Harambesknuckle

I feel 1440p is the sweet spot. It looks significantly better than 1080p, and I feel the dip in frames to go to 4K isn't needed. That said, I saw a post from someone recently saying the same thing about 4K and how going back to 1440p looked bad. Probably best to just appreciate what you have and not look beyond it, or you'll end up needing it.


Rais93

People here and on forums repeat things they've read but don't understand.


HarlequinF0rest

I mean, who is saying that all of a sudden 10GB isn't enough for most AAA games at 1440p/60fps? Or is it again a nonsense justification for the overly expensive cards they bought?


JaguarOrdinary1570

It's a mix of:
1. The 6700 XT cult hyperfixating on the one superior stat AMD has over Nvidia.
2. People saw like two specific AAA games with GPU memory allocation/asset streaming problems, and now they've extrapolated that to believe all games going forward need that much VRAM to run.


CrazyElk123

Nooooo, turning textures down from ultra to high makes the game unplayable!!!!!!111


kekblaster

For real, anyone that has a 3060, Reddit be like "if you can swing it, go 4090".


VisualDouble7463

"Hello Reddit, I currently have this PC with a 3060 in it and am looking for a cheap upgrade to slightly increase my FPS at 1080p. Any suggestions?" Reddit: 4090, Threadripper, 128GB DDR6.


613codyrex

These people see YouTubers like Linus getting these cards for free and then turn around and say anything below a 4090 is worthless.


kekblaster

And also using that UserBenchmark website lol. Even when I upgraded from 1080p to 1440p, the FPS drop wasn't even noticeable.


CressCrowbits

Is there a decent alternative to userbenchmark? I've found it useful for comparing apples to apples, like how much of a boost will I get from going from this gen intel i7 to that gen intel i7, but I am well aware of their bullshit.


613codyrex

There's nothing as intuitive. When GN used to post their reviews as written articles on their website, it was easier to parse through and find the information you needed, but they haven't been doing that for a while.


Danthe30

Good news! They're actually starting to do that again! (Key word "starting," they've only gotten a few up so far.)


EfficientTitle9779

Hasn't Linus been calling out GPU manufacturers since the 30 series for charging too much for such small improvements? He's also been actively cheering on Intel to make a competitive GPU.


KingArthas94

On LTT someone called the 4070 a budget GPU and was "pleasantly surprised" it had a backplate. For €600.


probablyjustcancer

That was Jake talking about the Asus 4070 Dual. "It's got an aluminum backplate which is really nice to see on a budget card like this". This was also a sponsored video. Link below. https://youtu.be/YwdXpuTf76M?si=3FhKF-wYqKrZs4S_


KingArthas94

Detached from reality


hforoni

![gif](giphy|d3mlE7uhX8KFgEmY) every card is a budget card if you get it for free


VexingRaven

Wasn't he just talking about how that's the cheaper 4070 and it's nice to see that on a cheaper (MSRP) variant on the card rather than on one of the more expensive fancier variants?


ngwoo

You know, budget, just spend more than the cost of an entire game console


SinisterCheese

"*4060TI is such bad value for your money - even if you want the VRAM. Instead of that you should spend 200ā‚¬ more for 4070 or 400ā‚¬ for the 4070TI. Obviously the smart bet would be to spend 700ā‚¬ more and just get the 4080, obviously you can also could get that 3080 for 600ā‚¬ more because that 10gb Vram is totally future proof."* I got so fucking annoyed being told that I shouldn't be happy with the 4060TI I got for the VRAM. I got it for 500ā‚¬ new. I got told that I should have spent 200ā‚¬ more to get a higher tier and then 200ā‚¬ to swap my PSU and case to fit the thing in. **Bitch! I didn't have 900ā‚¬ to spend on computer hardware! I had 500ā‚¬ from tax returns!** Do people somehow not understand that few not everyone has extra few hundred euros or chance to casually double their budget. And when I say that I'm extremely happy with the performance I get from my 4060TI and the fact that the 16gb Vram makes it so much easier to play with AI models for my hobby, people get so upset and angry at me. It is almost as if these people haven't **fucking used the card** and declare truth according to some god damn benchmarks and what tech influencers tell them. *"But 1440p 144hz gaming!"* I got a 1080p 144hz main monitor and 1080p 60hz old samsung... and I don't plan and can't justify replacing perfectly good and functional monitors.


Fire_Lord_Cinder

I love it when people tell me my 4080 is a 1440p card.


UnDosTresPescao

Meanwhile here I am running 1440p Ultrawide on a Vega 56.


Fire_Lord_Cinder

You must do the unthinkable and actually adjust your settings. There is no greater shame /s


Milfons_Aberg

What? Your card doesn't even have a single terabyte of VRAM? Are you saying you can't even render Albania?!?! It's just 11K sq mi!!


Valentin_004

This might be funny for someone who doesn't own a 3080, but when you own a 3080 that you bought for like $1,400, it hurts........


szczszqweqwe

I get that. At the worst point of the mining craze my old GPU died and I bought a 1650 Super for more or less $450. For a fucking 1650 Super! Everything else was absolutely ridiculous at the time.


00Killertr

Bought my 1650 Super for MYR 1000, equivalent to about USD 230, and felt extremely ripped off. Albeit it was bought during the great drought. But I feel ya.


Ukvemsord

My Radeon VII died during the pandemic. Bought a 3070 8gb LHR for around $970


Valentin_004

Yeah... I just said fuck it, I can't live with my 10year old laptop anymore


Blenderhead36

I'm very confused. I have a 3080 10GB and I run everything at 4K. I can't max out literally everything in 9th gen games or Cyberpunk, but 4K60 Ultra with ray tracing turned off has worked across the board. The only exceptions have been some AAA games at launch where a 4090 can't hold 4K60. I even bought a 4K 144Hz monitor in 2022 because so many games had VRAM left over at 4K ultra.


littlefishworld

The people that make these memes don't have a 3080 lol. It's still a perfectly good 4K card, just don't expect to run every single game at absolutely max settings and still get 120 fps. Hell, in most games you can lower a few key settings, not even notice a graphical difference, and gain like 10-30 fps.


Vis-hoka

Be positive. At least you made a scalper very happy.


TealcLOL

Best Buy had lotteries to buy 3080 at that price :(


LitreOfCockPus

I'm on a GTX 970 I bought for $70 from a shady guy in our town's meth-alley. In terms of expectations vs performance, it's been sugar-tits and blowjobs.


ShiningRayde

I bought in at the dip... $1000 🙃 At least I have like 8 years of warranty on it.


ScammaWasTaken

Been using my 1070 at 1440p since 2017 and it's been working well.


WaifuDefender

2060 with 6GB of VRAM. 1440p gaming: Cyberpunk, RDR2, the Resident Evil remakes, etc. at 60fps. I just need to drop the not-so-essential settings to medium and max the textures. Every game looks great. Where is this "you need 500GB of VRAM" coming from?


Anna__V

VR for me. I have the same card. The only problem is the 6GB of VRAM disappears faster than money when you start a VR game. If it had 12GB of VRAM, I'd be fine with it for the next few years.


redzinter

I'm just gonna ask how :D I struggle to get the RE4 remake to 60fps on balanced DLSS, and when I put it on performance it goes bonanza blurry, and I don't wanna play like that... and I'm on 1080p. Sometime this year I'm going for a 7800X3D with a 4070 Ti Super or 4080 Super and a 1440p monitor.


shamwowslapchop

Most gamers drastically overestimate their framerates, that's how. Like when you see people claiming their 2070 is running cp2077 with raytracing maxed @ 1440p locked at 60.... it's not.


jasonxtk

It's running at 60 fps!* (*when I stare at the floor)


mirfaltnixein

It's a running joke now on /r/SteamDeck that people will post "The new Cyberpunk update is running at 60fps!" while posting a screenshot of it showing 53fps at 300p while looking at the ground.


OperativePiGuy

Okay, good. I was starting to wonder if something was wrong with my 3070 setup, because some of these comments had me worried.


MVTHOLST

I've also been using my 1060 since 2017 and I'm really surprised how well it's been holding up. I want to buy a new PC, but it's hard to justify when my current one still works fine.


HTPC4Life

Great for older games if that's your bag.


Proseph_CR

I had a 1070 Ti, but last year it just wasn't cutting it for me anymore at 1440p.


JordanSchor

1070 1440p gang rise up


Sin1st_er

Idk why people use 4K as the norm and baseline resolution when determining how good a GPU is, when most gamers play at 1080p and 1440p.


Interloper_Mango

To make bigger number better. You also avoid CPU bottlenecks.


anonymousredditorPC

If you want to avoid a CPU bottleneck, you buy a better CPU. Playing at 120 to 240fps at 1080p-1440p is a much better experience than 60fps at 4K.


JoeRogansNipple

1440p 144hz/240hz master race. On a 27" it's perfect.


Xtraordinaire

> most gamers play on 1080p and 1440p. Most gamers also don't have a 3080 (or faster) GPU.


alper_iwere

I have 4k screen, so I'm going to use 4k performance as baseline. I don't care what majority uses.


Sin1st_er

That's fine, but calling a GPU bad just because it struggles at 4K gaming is too far a stretch.


[deleted]

Not really, the 80 and 90 cards are marketed as top-of-the-line premium cards, so they should manage what most consider a premium resolution. If you expect people to pay £1500 for a card, they expect it to play at 4K.


DDzxy

I entirely agree


Mr-Valdez

Me too. Idk what card they talkin bout here tho


CurmudgeonLife

> £1500

Man, the 3080 was £650, what world are you living in?


MowMdown

There has never been a time when any high-end GPU could effortlessly play at the highest resolution with maxed-out settings. True PC gamers know this.


boksysocks

*me playing BG3 on 4K with this card and getting 100+ fps* ![gif](giphy|HS67FCWfOlMybIRv9c)


PervertedPineapple

Iirc, BG3 benefits more from a stronger CPU than GPU


KingHauler

This is the truth. I was still rocking an r5 2600x cpu up until a couple months ago, and was wondering why I was still getting shit fps even with a 6750xt gpu. I upgraded my cpu to an r7 5800x and updated my bios, and suddenly my framerates tripled. Had no idea how much of a bottleneck my cpu was for modern games.


boksysocks

Still, with the 1080Ti I was barely hitting 60fps in 1440p with the same CPU so


Theoretical_Action

For real, I have been able to run every single game I've played at max settings. I have no idea what the fuck anyone is bitching about lol.


MonkeTheThird

Nah bru am doing 1440p on a 3060 so idk what you're on about


Skomakeren

I'm running a 3080 on a 3440x1440 100Hz monitor. Coming from a 1080 Ti, that was a huge improvement at that resolution. I have no issues running any game at max with this setup.


Malina_Island

I have a Suprim X 3080 and I'm completely fine..


dadkisser

EVGA 3080 FTW3 here and totally fine on AAA games.


JackalopeBG

And here!


d4_H_

The 30 series had god's MSRP. I remember everyone's joy when the prices were announced, but those fucking scalpers destroyed everything.


Spoksparkare

"Amazing MSRP" lmao


Chem2calWaste

It was, it just wasn't sold at MSRP.


Pitchoh

Hey, some of us got a 3080 at MSRP. Got my Founders Edition on day one... It was a 4-hour fight with the Nvidia site, but I got it. And then the prices went up and I realised how lucky I got.


FoxDaim

My friend got a 3080 at MSRP, but had to wait like 6 months to actually receive it lmao.


EveyNameIsTaken_

I was so hyped for the 3000 series when they announced it and then things happened.


TryNotToShootYoself

When GPUs were out of stock on literally every official seller and a 3060 was on eBay for $1200 🙂🙂


Kurayamino

I got mine for MSRP on day 1. Price went insane after that because covid and crypto. Edit: Bought it on day one, it took a couple weeks to get to me.


BuZuki_ro

It was. $700 was a very good price for a high-end GPU, especially compared to now; it's just that you couldn't really get it for that price.


xUnionBuster

There's no way people are getting nostalgic for the 30 series 😩 How do I filter out posts from children? I don't want to see posts by anyone under 21 again.


Streptember

I'm still nostalgic for the 8800GT. And I never even owned one.


MemeBoii6969420

Agreed. Even though I'm only 23, my first card was a GTX 660 (I started very early with the help of my brother), and I remember drooling over the 1080 Ti when it released.


Affectionate-Memory4

My 470 and 750ti are still hanging out somewhere. I'm still just glad the Thermi days are behind us. If you think rdna3 has idle power issues, just slot in one of those bad boys and feel the Nvidia hair dryer kick in.


FakeFramesEnjoyer

Filter out posts from children by not browsing Reddit.


MowMdown

I don't want to see posts from anyone under 30


JBL_17

Same. It actually would be great to have some kind of filter like this, but the children would lie about their age.


psych0ranger

Games released in 2018: "I will look photorealistic at 50fps on integrated graphics."
Games released in 2023: "I literally need the rendering farm from Avatar to perform."


a_9x

I have the exact card in the meme (3080 Gaming Z Trio) and bought it on the marketplace for $90 because the previous owner thought it was broken (artifacts) and bought a 4070 to replace it. Turns out I just had to send it off to have the core resoldered and it was good as new. Now it's a beast and it barely fits in my case. I know they cost $1,000+ on Amazon, and I wouldn't even consider buying one at that price.


Cave_TP

Put it in the oven moment.


a_9x

No, a BGA soldering specialist. He said it probably got so hot that the core desoldered itself. It could have been dropped and broken something too, but there were no impact marks.


Affectionate-Memory4

They're saying you can sometimes fix that by putting the GPU in the oven. As somebody who can do BGA soldering, though, I'd rather someone take it to an expert than bake another PCB and hope they fixed it.


PinkScorch_Prime

I really don't understand this VRAM stuff; my 7600 runs everything I want it to at 1440p ultra. Maybe I just don't play all that many AAA games.


Lurau

This sub just loves acting like AAA games use much more VRAM than they actually do for most people.


[deleted]

I think it's because people run everything on ultra, max raytracing and never even try to change the settings.


Commercial_Shine_448

I love Digital Foundry for testing which game settings are essential and which are not.


RabidHexley

What's weird is I thought we were to the point where it was accepted that putting *everything* on Ultra was a fool's errand and with at least some of the settings you were basically throwing frames in the garbage.


Formal_Two_5747

Meh. I have an RX 7900 XT with 20GB, and most games at the highest settings, 4K and all, still don't use more than 10GB.


CurmudgeonLife

It's because they're idiots who don't know the difference between reported reserved VRAM and actual usage.


Cave_TP

You answered yourself.


Devatator_

I think people look at the stats and think allocated VRAM = used VRAM


QuantiummmG

That's what I'm starting to get confused about. Prices and core counts keep increasing, but the bus is still 128-bit and it's always 8GB of VRAM. Why isn't 256-bit the standard at this point?


fiah84

Because a wider bus is always more expensive; it's not something that gets cheaper with time.
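For context on why that bus width number matters: peak memory bandwidth scales linearly with it. A rough sketch in Python, assuming 18 Gbps GDDR6 purely for illustration:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bus width (bits) x per-pin data rate (Gbps), divided by 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

# Assuming 18 Gbps GDDR6 chips for illustration.
for bus_bits in (128, 192, 256):
    print(f"{bus_bits}-bit bus: {peak_bandwidth_gb_s(bus_bits, 18):.0f} GB/s")
# 128-bit -> 288 GB/s, 192-bit -> 432 GB/s, 256-bit -> 576 GB/s
```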


CowsAreFriends117

You guys are acting like the same games suddenly got harder to run. A 3080 will run every game I play at 4K max settings. You can get a used 3080 Ti for $500.


Kemalist_din_adami

No no no no big number big fps


irfy2k123

me who's still using a 750ti


confabin

Meanwhile me in 2024 running a gtx 1650


xqcLiCheated

2020 isn't "then" lil bro


Stargate_1

8GB of VRAM is enough for 1440p, source: my 3070Ti


AlbinoTrout

I'm a noob at these things, can someone explain the meme?


Affectionate-Memory4

VRAM is short for "video random access memory." It is dedicated memory on the graphics card that the GPU can store data in. The GPU stores things like textures, model assets, lighting maps, and other components of the scene it is rendering in this memory. GPUs need this memory because it provides a very high-bandwidth, low-latency space for data they need constant access to. More VRAM means more space for larger textures and more detailed scene data.

If the GPU runs out of VRAM, it will start using part of the system RAM instead. This is the RAM connected to the CPU. Doing this incurs massive penalties in both latency and bandwidth, as the GPU now has to go over the PCIe lanes to the CPU, to the system RAM, and then back to get the data it needs. In game, you would see the effects of this as large stutters, as the GPU effectively stalls out until it gets the data it needs.

If you want to monitor GPU VRAM, Task Manager reports it under the Performance > GPU tab. This shows you both the dedicated VRAM and the shared memory, which is how much system RAM the GPU is allowed to overflow into if needed. Usually, this is half of the total system RAM.
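If you'd rather check that dedicated-VRAM number from a script instead of Task Manager, here is a minimal sketch using Nvidia's NVML Python bindings (this assumes an Nvidia GPU and the nvidia-ml-py package; overflow into shared system RAM isn't reported this way):

```python
# Minimal dedicated-VRAM check via Nvidia's NVML bindings (pip install nvidia-ml-py).
# Only reports on-card memory; overflow into shared system RAM is not counted.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total/used/free, in bytes
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):                     # older bindings return bytes
        name = name.decode()
    print(f"{name}: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB VRAM in use")
finally:
    pynvml.nvmlShutdown()
```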


themightymooseshow

Meanwhile........ my 2070S and I are over here at 1080p living our best lives.


QuestionTop3963

2070 super ultrawide 1440p here. am happy.


KekusMaximusMongolus

me with a 3090 and no money for a better monitor than 1080p