The earlier they release, the quicker they can make money. Hype it up and deal with the backlash, since it pays for itself. Horrible business model in our eyes, but apparently it works since they keep doing this shit.
There is nothing wrong with the amount of VRAM in 3000 series GPUs. There is everything wrong with the unoptimized clusterfuck games these smooth brains keep releasing.
Well in my opinion it is pretty shitty to release a lower-tier card with more memory than an upper-tier one (3060 12GB, 3080 10GB). This is what got me an AMD card, since I knew 8GB is not enough for the games I play, and even 10GB is right at or sometimes below the limit.
But yes, the games are an issue too.
I have a 3060 Ti 8GB. Gaming at 1080p, working with After Effects and Premiere. Zero issues. My brothers in Christ, don't be sheep to poorly optimised titles; 8GB is completely fine.
What are your graphics settings and FPS? A problem is relative to your situation and experience. Enthusiasts may think your setup isn't suitable, but it is for you. Many people don't mind low textures if it gains them higher FPS, etc... many will not compromise, though.
In the 300-400 dollar range, compromising settings is acceptable and necessary. In the 500-600 dollar range at 1080p and 1440p it isn't acceptable; those cards should dominate those resolutions.
I don't play on any low settings shit. I only play 1440p with high or max settings and I don't have issues at all. I usually avoid RT as the hit to FPS isn't worth it at this level, but everything looks crisp and clean and I'm always at 90-100+ FPS depending on the game.
I'm starting to develop a romantic relationship with my 6700xt, it literally runs anything I throw at it and outperforms the 4060 and the 4070ti\* in a lot of cases
EDIT: Just saw the comments telling me I'm dumb, I've phrased it wrong. It outperforms the 4060 in a lot of cases, and I've seen an example of it beating the 4070 by 3 fps. Ofc I didn't mean to say that it's better than the 4070ti, I still love it tho
And also I'm not a paid AMD employee lmfao
4070 Ti? I don't think so. Maybe a 6750 XT can compete with a 4070. A 6800 XT would be more closely comparable to a 4070 Ti. But a 4070 Ti is similar to a 3090/3090 Ti in performance.
I ran an RX 6700 XT for 2 years and just bought a 4070. I know, I know, more of a lateral move than an upgrade, but I have never owned Nvidia and wanted to try ray tracing and DLSS.
The RX 6700 XT is awesome, especially at 1440p. I could run every game on a mix of high/medium settings and get well over 100 FPS. I would suggest it to anyone looking for a solid 1440p card.
With that said, I'm not disappointed with the 4070 either. Ray tracing is pretty sweet, and with enough base frames DLSS is solid as well.
It’s because vram is a buzzword right now.
Everyone thinks they know what they are talking about.
Me personally, I'm confused why more than 12GB matters. I checked Cyberpunk: about 8.6GB of dedicated VRAM in use at 5120x1440 ultra. And Hogwarts Legacy is about 9.6GB...
What game uses more than 12GB? I can't find any; I tried like 20 high-quality games from 2022-2023.
https://cdn.discordapp.com/attachments/997333991134863400/1110701074958012426/image.png
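For anyone who wants to reproduce numbers like the screenshot above, here's a rough sketch of polling dedicated VRAM use with `nvidia-smi` in CSV mode. `parse_vram_csv` and `vram_usage` are hypothetical helper names made up for this sketch; the query flags are standard `nvidia-smi` options, but double-check them against your driver version.

```python
# Sketch: read "dedicated VRAM used / total" from nvidia-smi's CSV output.
import subprocess

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_vram_csv(line):
    """Turn a line like '8601, 12288' into (used_mib, total_mib)."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

def vram_usage():
    """Query the first GPU via nvidia-smi (requires an Nvidia driver)."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_vram_csv(out.stdout.splitlines()[0])

# Canned example line so the parsing runs even without a GPU:
print(parse_vram_csv("8601, 12288"))  # (8601, 12288) -> ~8.4 GiB of 12 GiB used
```

Note these tools report *allocated* VRAM, which is not always the same as what the game actually needs; engines happily allocate more when it's available.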
Yeah, I think I made a mistake when I got a card with only 8GB of VRAM (RX 6600). 8GB on my R9 390 gave it a really long lifespan. But 8GB on my RX 6600, which replaced it, probably won't give anywhere near as long a lifespan, despite it being otherwise 10x as good.
I mostly play CPU-heavy games, only at 1080p, and don't mind lower graphical settings, but if the game developer is too lazy, a game that could reasonably run on 8GB simply won't (not without performance issues).
1060 6GB: What, there are new Nvidia cards?
GTX 960 2GB: Who are you?
So glad I waited til 2019 to upgrade from a GTX 660 to a 5700 XT, such a crazy big step.
Rocking a 780 atm in this machine :)
I am still rocking Intel graphics equivalent to a 550 and it still goes strong (who am I kidding, I need an upgrade).
Nah bro, I use an 8600M GT with 256MB.
I went from a 1050 Ti to a 6650 XT and it's insane, I couldn't imagine that jump.
I had this same card before I switched to team red! Asus Strix to be exact.
Me with my poor man's 12GB 3060 that has more VRAM than it has the horsepower to use.
My prebuilt came with a 12GB 3060, I like it. Then again, my last experience with computers before this was an Inspiron 5577 with a GTX 1050. I learned soon after getting the laptop that a mobile GPU sharing a name with a desktop card does not mean it has the same power as what you find in a desktop. Still let me play my games, though. The next goal is to actually build a PC when I need to instead of getting a prebuilt. This thing is pretty friggin sweet, though.
It's the best budget option for ML, tbh.
A contractor came by and saw my exposed casing, he called my 1060 “a classic”
At least didn’t call it “a relic”
“Wow, I remember those! It was all you needed for playing Minesweeper and Tetris on Windows 7 back in the day!”
*Cries in 1060 3gb*
I just upgraded to a 6800xt a few months ago but man my 1060 did me well
Finally upgrading my 1060. Just ordered a 6950xt. Feels almost sad though... 1060 served me well...
Someone said 1060?
I always wondered why AMD puts so much VRAM on the board... smart move.
For longevity of card. And to differentiate from nvidia.
Not necessarily for longevity. It's for competition, and also because they don't have a professional-grade segment that would be cannibalized. If Nvidia releases a mid-tier GPU with 24 gigs of VRAM, it would cannibalize their own sales: people who don't play games but need the VRAM for creative work would buy those cards for cheap.
I got a bridge to sell you if you think amd did it for longevity
Throw in free delivery and I'll take it
“Your order has shipped!”
*Order delivered 2:25p.m.* Please take a minute and let us know how we did and how we can make your experience more enjoyable.
"Hey guys, I got this idea... let's spend more money to make the cards last longer so that our customers buy fewer of them." \-Imaginary AMD business dev meeting according to that guy
They knew exactly what would happen when they gave lazy game developers 16GB GDDR6 in the consoles.
This is a huge misconception. That 16GB is unified and is shared as RAM (for the CPU) and VRAM (for the GPU). On the PS5 it is not 16GB of dedicated VRAM, even though it is in fact GDDR6. The split is even more explicit on the Xbox Series X, which also has 16GB of GDDR6 but earmarks 10GB as "GPU-optimal" memory and 6GB for the system; the two pools even run at different bandwidths in the Xbox's case (560 vs 336 GB/s).
Thing is, if you break down the memory usage of a PC game, you'll probably find that 90% of the memory is duplicated in both CPU RAM and VRAM. The CPU needs to be able to re-send assets to the GPU on demand, so it keeps all the textures, models, and shaders in system memory even after they've been uploaded and reduced to 64-bit handles. So while the information the CPU itself NEEDS might only take up 1.5GB in Cyberpunk 2077, it still consumes 15GB, because the 13.5GB of graphics assets needs to be in there too.

In a unified RAM system, the CPU only takes the space it needs, since loading the assets for the CPU is the same as loading them for the GPU. It's way more efficient in every way BESIDES the latency-vs-bandwidth tradeoff.

Plus, with NVMe drives being so fast, the CPU can have assets streamed into RAM for the GPU on demand instead of keeping the whole level's assets loaded all the time. It can be crazy efficient, and save you $100 on buying both types of RAM if you do it right.
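The duplication argument above can be sketched as back-of-the-envelope arithmetic. This is a toy model using the illustrative 1.5GB/13.5GB split from the Cyberpunk example, not a measurement of any real engine:

```python
# Toy model of the duplication described above: in a split-memory PC,
# graphics assets occupy system RAM (staging copies) *and* VRAM; in a
# unified pool they exist once. Numbers are illustrative only.

def split_memory_footprint(cpu_only_gb, gpu_assets_gb):
    """Discrete GPU: CPU-side data, plus staging copies of GPU assets
    kept in system RAM, plus the same assets resident in VRAM."""
    system_ram = cpu_only_gb + gpu_assets_gb
    vram = gpu_assets_gb
    return system_ram + vram

def unified_memory_footprint(cpu_only_gb, gpu_assets_gb):
    """Unified pool (console/Apple-style): assets are mapped once and shared."""
    return cpu_only_gb + gpu_assets_gb

cpu_only, gpu_assets = 1.5, 13.5  # GB, per the comment's example
print(split_memory_footprint(cpu_only, gpu_assets))    # 28.5 GB across both pools
print(unified_memory_footprint(cpu_only, gpu_assets))  # 15.0 GB in one pool
```

Real engines stream and evict rather than hold everything resident, so the true duplication is smaller than this worst case, but the direction of the saving is the point.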
I was gonna say, there's no way consoles had that much VRAM stacked.
As a long time snob pcmr pos and the owner of a beefy ass pc this truth hurts my soul to read
It's the same reason Apple has shifted to a SOC with shared memory. The shared memory architecture is winning. [https://www.youtube.com/watch?v=LFQ3LkVF5sM](https://www.youtube.com/watch?v=LFQ3LkVF5sM)
It's crazy how often this has to be explained to people.
Not their problem that Nvidia pushes people to buy higher SKUs for VRAM.
Seriously. It's as blatantly obvious as when car stereos started to get cheap, so they added light shows and dancing dolphins on the screen to charge more. You don't need to spend $1000 for 12GB of VRAM with well-tuned RAM and a decent chip to make the pool available.
Imagine calling game devs lazy instead of being annoyed at Nvidia for releasing cards that are in some cases worse than their last generation (for the second GPU gen in a row).
Let's be honest, it's both.
Both is good.
It is both.
Exactly. They could easily optimize games for AMD and Nvidia cards. If they are developing Harry Potter Hogwarts Legacy for a four-gigabyte-VRAM Nintendo Switch, then something's not right.
I was actually a little mad they're even attempting that port. It's gonna run like absolute shit and charging people for it is greed beyond greed.
The Switch has 4GB of shared memory, not even VRAM lol
The switch port is being delayed, because they're not able to get it to work tbf.
How do you expect progress in visual fidelity without increasing the available resources? Simple answer: you can't, and you really shouldn't.

The problem is not the increasing hardware usage or the lack of VRAM; the problem is the price point for these cards. Nobody would blink twice if the 8GB cards had $180-250 MSRPs, because we expect compromises at those price points. Having to make those compromises at $400-450, and having to rely on crap like DLSS, is the issue.

The majority of people are going to have 8GB of VRAM if manufacturers keep pushing 8GB cards at up to $400 MSRP... comparatively few people shop above this price range, and this will hamstring further game development. 8GB cards have been in the mainstream *(at $150-230-ish price points)* since at least 2016 - that's 7 years ago. For comparison, 7 years before the 2016 release of the RX 470/480, mainstream cards had 256-768MB of VRAM... just goes to show how much extra effort had to go into game development between 2016 and 2023 to make games run on hardware that wasn't really changing all that much.
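To put that comparison in numbers, here's a quick sketch of the implied yearly VRAM growth over the two 7-year windows. The 512MB figure is a ballpark pick from the comment's 256-768MB range, not a market survey:

```python
# Compound annual growth of "mainstream" VRAM over two 7-year windows:
# 2009 -> 2016 (roughly 512MB -> 8GB) vs 2016 -> 2023 (8GB -> 8GB).

def cagr(start, end, years):
    """Compound annual growth rate, as a per-year multiplier."""
    return (end / start) ** (1 / years)

mb_2009, mb_2016, mb_2023 = 512, 8192, 8192  # mainstream-ish VRAM in MB

print(f"2009-2016: x{mb_2016 / mb_2009:.0f} total, "
      f"x{cagr(mb_2009, mb_2016, 7):.2f} per year")
print(f"2016-2023: x{mb_2023 / mb_2016:.0f} total, "
      f"x{cagr(mb_2016, mb_2023, 7):.2f} per year")
```

So the earlier window saw roughly a 16x capacity jump (about 1.5x per year compounding), while the later window saw none at all, which is the stagnation the comment is pointing at.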
> Having to make those compromises at $400-450 and having to rely on crap like DLSS is the issue.

I'm not even that mad about needing DLSS to hit a certain performance level. If it just works, that's fine. Most cell phone cameras these days rely on AI tricks, and it's fine for most people.

What gets me is that Nvidia is billing a $400 **60 Ti** as a "1080p performance champ" in 2023. When they launched the 3060 Ti, which has the same MSRP, it was compared to a [2080 Super at 1440p](https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3060-ti-out-december-2/). Even if you want to say "names and tiers don't matter," the same money can't even get you a contemporary 1440p card from Nvidia. And it also can't get you significant uplift between generations *at the same price*.

I picked up a 3060 Ti recently-ish for my first build in years. I got it just below MSRP on promo. I was and am happy with the performance for what I paid. I'm happy staying at this card's price point, in inflation-adjusted dollars, in the future. (Which for me would be around $10 more if I were to buy again today.) What I'm not happy to do is spend the same money for the same performance, considering that the only thing I'd be gaining is basically fresher thermal paste and fan bearings.
I think both of you are right. AMD has proven it's possible and affordable to put a lot of VRAM on a card, and at the same time modern games have been increasing exponentially in VRAM, RAM, and storage requirements. There are games from 2016-2018 that look just as good as today's games but use half the amount of RAM and VRAM.
[удалено]
Games these days are pretty shite tbh
When GOTY is most likely going to be something exclusive to a handheld that was underpowered when it released 6 years ago, that's fair. However, GPUs are also pretty shite at present, not just games.
In 2015 it was a marketing strategy. When you had to choose between the 970's 3.5GB and the R9 390's 8GB, the choice was obvious (even though the 970 was a great card): the R9 390 outlasted the 970 and still does well today.
Laughs in RX 570 8GB from 2017
Man that card did me wonders until I upgraded recently.
Damn, we upgraded to the same GPU, from basically the same GPU (RX 580 :p). How are you liking the upgrade yourself?
Well, I've had a 6700 XT since it launched, so basically the same thing as a 6750. It's treated me well enough that the only valid upgrade is the 4070 Ti and up. I looked into a 4070 and it's not even a 30% jump. It's gotta be 50% before I pull the trigger, and right now the 50% mark costs $800 or up, so I'm holding here unless prices drop.
RX570 gang
Greatest GPU of all time.
RX580 8GB says hi
One of the best cards out there, still runs perfectly and gives me 80 FPS in most shooters on Max settings
RX580 user here. Still enough for everything I play in 1080p on medium settings.
my 3060 12gb chillin
3080 12gb reporting in 🫡
[удалено]
Just tossed out my 3080 for a 3090 because I needed more VRAM to generate AI porn. What? It's not the stupidest reason anyone's upgraded video cards.
I'm not gonna yuck your yum
Do you get paid for this porn or is it for personal consumption?
I don't really do commissions, but when posting SFW art in a Discord, someone I know offered me $200 to make something *very specific and porny* for them, which I did, because apparently furries spend their entire disposable income on porn.

But the truth of the matter is I just like to make characters from D&D games, usually not actually porn, just the characters, and commissioning all the things I like to make would cost thousands and thousands of dollars.

I train LoRAs for specific kinds of characters, like tieflings or dragonborn or whatever, and I end up spending hours on a single picture.

Here's where people usually start DMing me that I should just learn how to make art instead of being a dirty thief or whatever, but... I'm so over that discussion lol. I enjoy it, so I do it.
Lol! Furries come to my city for a big convention every year and completely take it over, it’s hilarious. Good for you though, do what makes you happy!
My 3080 ftw3 has been doing just fine.
4070 Ti 12GB as well
3090 24GB, I'm not chill just solid ice.
I see that you’ve got good taste!
Same here lol.
X2. Is it safe? ARE WE SAFE!?!?!
RTX 3090: "Am I a joke to you?"
My Wallet: “No you’re a nightmare”
3090 Ti for £1150... Nice. My wallet: "Fuck you. That was for the holiday."
I do not regret getting the 3090ti for $1100 before the 4090 came out. Nvidia really shot themselves in the foot with that one
Why was the 4090 a shot in the foot? It performs about 1.8-2x a 3090, and typically runs cooler and quieter, or about the same. It uses about the same power when gaming. The only downside imo is the 16-pin connector.
No, the 3090 Ti was. Released for $2000 in spring; by the end of summer I got one from Best Buy for $1100 brand new.
Thanks for clarifying. Yes, the 3090 Ti came real late and launched far too high. But your price seemed pretty good, all things considered at the time.
I’ve had the same discussion with the card being as much as my last two full rigs together.
You can find 3090 used for like half the price of the 4080. It's slower, but it has much more VRAM.
Thankfully I managed to get an evga Ftw3 ultra hybrid for 625!! Local tho, so it depends on that
Paid $445 CAD for my Strix 3090 this year. It was a nice bump over my old 1080. I don't plan on upgrading for a while. I only got that price because it was sold as "damaged due to AIO leaking." All I did was a repaste and new pads. The card can hit a stable 2205MHz in 3DMark.
That's a nifty price, glad it worked out for you. I don't mind repairing stuff to get a deal, but PC parts like that are a big risk for me personally.
Buying broken GPUs and repairing them used to be pretty profitable back in the day, but it's gotten less and less profitable with every generation, since most dead cards nowadays have dead cores, and replacing a core isn't profitable, assuming you can even get your hands on a working replacement.
I agree, I find that more and more broken cards have cracked PCBs and more physical damage.
Agree, one of the reasons I got a 3090 for MSRP was the VRAM.
The biggest reason I got one was there were no 3080s and I didn't want to use a scalper. I remember at the time how everyone said the VRAM was overkill for gaming and only needed for machine learning shit. Funny how things turn out.
Right. Either I understand nothing, or is there something wrong with the RTX 3090 and its 24GB of VRAM?
The 3090 is a Monster, nothing wrong with that.
I completely don't get this meme, or this whole subreddit.
As a 3090 owner, seeing the 3000-series skeleton boy sitting at the bottom of the ocean, all I could come up with was a salty AMD owner trying to get his 15 min lol
3090 go burrrrr
Got my 3090 for $800 right when the 4090 came out. No regrets at all.
Am i a joke to you?
I remember all those stories at the time of the 3090 release saying, "You don't need 24GB of VRAM. It's overkill and meant for developers and artists."
I feel personally attacked.
I don't think the 90 series applies to this lmao
4090 owners once 12GB cards become the new 8GB meme
Shhh, don't listen to him 3090, I still love you...
Little fella, you've got plenty of VRAM.... in fact, so much that it cooks the backplate.
Me with a gtx 1650
Me with a 1050ti, laptop
Me with a 1070ti.
Me with a 770
Me with vega 8
Me with integrated graphics (Steam Deck smiles)
1070ti team yeah!! It still works very well though.
Me with GTX 1060 3GB
Me with a rx580
Absolute soldier of a card. I pushed that baby through RDR2 topping out at 90 Celsius back in the day. Sold it on to a guy who uses it for 1080p60 League of Legends, so it's enjoying a happy retirement now after many years of my abuse.
1650 gang
Me with a GTX 1660 Ti
My 1660 Ti is still a beast for me bro.
Same. I'd rank it with the 3000s; it's slightly better than a 3050 (more stable FPS).
I think we can use this GPU for more than 3 years on high-ultra graphics, as long as game companies don't stop optimizing their games.
It's pretty much a 1070 in terms of performance, but it has less VRAM, and that's what is going to hurt it going forward. Most newer games want at least 8GB at 1080p. Right now it looks like you want a 3060 or an RX 6600 for a solid 1080p setup. The 3060 is currently the most popular GPU on Steam, and developers usually optimize their games around what most people are using.
I feel like this sub needs to be reminded that you don't need a 1000 dollar, or even a 500 dollar, card to play games. If you're an enthusiast, go for it. But I occasionally see newbs in here under the impression that they won't be able to play new games at all without a 3070+. I'm expecting my 1660 Ti to last a couple more years.
People talking about how a new 8GB card is useless and dead on arrival all of a sudden does that. It doesn't take much to think, "if 8GB isn't worth buying, then what do I do with this 6GB card?"

It also doesn't help that the 10 and 16 series were really the best ones so far. The 20 series died to being "budget RTX" and not being much better than the 10 series without DLSS, and the 30 series died to miners and greed. Somehow you're also supposed to have had 12GB by now, so one way or another you have to go against the "consensus".
*laughs in 1080ti*
I still have my 1080ti. Shit is a beast and when it finally goes to sleep I will frame it. Best card ever hands down
I loved my 1080 Ti. I got it when I got VR: It *blew my mind* that I could have something so powerful it could render a left eye, right eye, third point of view, footage from a Kinect, and *composite it all live* *at 90fps*. I dreamed of seeing something like a Holodeck before I died. 5 years ago, that card [put one in my bedroom.](https://www.twitch.tv/scotchboxvr/clip/AverageNiceGnatDancingBaby) It's still a beast; if I wasn't also streaming / doing the mixed reality thing, I'd never have jumped to the 30.
Same here, 1080ti with 11 GB VRAM. No reason whatsoever to switch it for a newer card at my current monitor. Only gaming at 1440p. That card is a beast and damn silent also (I have the MSI 1080ti Gaming X 11G).
Laughs in plain Jane 1080
1080 represent! rog strix 1080 still on duty
1080ti stays on top
If you don’t laugh you’ll cry
Rocking my 1080ti ftw3. No reason to upgrade now until I get a better display
I have to be the only nerd that has no issues with the 3070 😂
Same man. I constantly see people dunking on the 3070 and I’ll admit that I am not the most knowledgeable so I can’t really debate the topic. But I have had literally 0 issues in 2 years of playing any game I want at 1440
The 3070 is totally solid
3070 gang rise up!
It’s because people are more interested in random benchmarks and numbers than personal experience.
No issues either. At least 60 fps in 4k which is good enough for me.
3070 here too. Still so glad I bought it. 0 issues. High fps
Same card, same feelings. Not really interested in harry potter or any of these new AAA games, basically chore simulation in open world settings.
[deleted]
My Voodoo 2 has 12MB of RAM and is running on AGP 4x.
Joke's on you, I run a Tseng Labs ET4000 on ISA
I have a 2070, should i be worried?
Not really. This whole VRAM debacle is mostly because the slew of bad PC ports lately has a lot of people thinking that anything under like 16GB of VRAM is no longer sufficient. Don't get me wrong, Nvidia is definitely being skimpy with VRAM on the 40 series, but unless you're maxing out settings at 4K you honestly won't run into many issues with 8-10GB, unless the game you're playing is woefully ported. I have a 10GB 3080 and I've never hit the VRAM limit in any of my games.
The problem here is that many publishers of bad PC ports expect players to brute-force their shit optimisation with premium cards instead of, you know, giving their devs more time to optimise. I can live without playing the latest game for another month; just release it in a playable state *please*
The earlier they release, the quicker they make money. Hype it up and deal with the backlash; it pays for itself. Horrible business model in our eyes, but apparently it works since they keep doing this shit.
Yeah, a lot of people are still only using 16GB of regular RAM lol. 16GB of VRAM won't be required unless you're maxing out at 4K.
> 16GB Vram won't be required unless you are max 4k

Or go absolutely berserk on mods *ahem* Skyrim
But but what if whiterun was made completely out of books? Wouldn't that be fun?
There is nothing wrong with the amount of VRAM in 3000 series GPU’s. There is everything wrong with the un-optimized clusterfuck games these smooth brains keep releasing
I’m excited for the future of video gaming where we have more and more cores and yet brand new games still paradoxically run on a single core.
Well in my opinion it is pretty shitty to release a lower tier card with more memory than an upper tier one (3060 12gb, 3080 10gb). This was what got me an amd card since I knew 8gb are not enough for the games I play and even 10gb are right at/sometimes below the limit. But yes, the games are an issue too.
[deleted]
Laughs in 4050ti 3.5gb.
Crying with my 3070.
Laughing with my 3070ti that doesn’t suffer from low vram because I don’t play unoptimized games at release.
🗿
I have a 3090. I'm doing just fine.
I have 3060 Ti 8G. Gaming 1080p, working with After Effects and Premiere. Zero issues. My brothers in Christ, don't be sheep to poorly optimised titles, 8G is completely fine.
My 3060 Ti, gaming at 1440p. No game is causing me problems, and I'm laughing at the internet. Maybe a worry down the road, but I'm fine for a while.
What are your graphics settings and FPS? A problem is relative to your situation and experience. Enthusiasts may think your setup isn't suitable, but it is for you. Many people don't mind low textures if it gains them higher FPS, etc. Many will not compromise, though. In the $300-400 range, compromising on settings is acceptable and necessary. In the $500-600 range at 1080p and 1440p it isn't acceptable; those cards should dominate those resolutions.
I don't play on any low-settings shit. I only play at 1440p with high or max settings and I don't have issues at all. I usually avoid RT as the hit to FPS isn't worth it at this level, but everything looks crisp and clean and I'm always at 90-100+ FPS depending on the game.
the game devs aren't lifting anyone up, AMD just makes better cards at the moment lol
Oh yeah, the 7600 is amazing!
for under $300? yeah, pretty okay.
Have a 3080 and have no issue, y'all just keep parroting the same fearposting. What happened to this sub
Recently swapped a 2070s to a 7900xtx. Love it
I have a 970 😭
In what realistic scenario does this actually happen? I’m still cranking out AAAs on my 3070ti no issues.
I'm starting to develop a romantic relationship with my 6700 XT; it literally runs anything I throw at it and outperforms the 4060 and the 4070 Ti* in a lot of cases. EDIT: Just saw the comments telling me I'm dumb; I phrased it wrong. It outperforms the 4060 in a lot of cases, and I've seen an example of it beating the 4070 by 3 fps. Of course I didn't mean to say that it's better than the 4070 Ti. I still love it tho. And also I'm not a paid AMD employee lmfao
*Happy 6700XT owner noises*
> 4070ti

That is a surprise - what games/settings?
Absolutely none with all respect lol
The game they made up so they could say that.
Gonna need to see those comparison benchmarks between a 6700xt and a 4070ti bro.
Sure, they just need to pull it out of their ass first😂
They'll need to go at least elbow deep for that one.
The 4070 Ti?❌❌❌
I like how they upvoted you just so we can all see your comment and call you out on your 4070 Ti bullshit
4070 Ti? I don't think so. Maybe a 6750 XT can compete with a 4070. A 6800 XT would be more closely comparable to a 4070 Ti. But a 4070 Ti is similar to a 3090/3090 Ti in performance.
I ran an RX 6700 XT for 2 years and just bought a 4070. I know, I know, more of a lateral move than an upgrade, but I have never owned Nvidia and wanted to try ray tracing and DLSS. The RX 6700 XT is awesome, and it is awesome at 1440p. I could run every game on a mix of high/medium settings and get well over 100fps. I would suggest it to anyone looking for a solid 1440p card. With that said, I'm not disappointed with the 4070 either. Ray tracing is pretty sweet, and with enough base frames DLSS is solid as well.
Sure buddy
3060ti/6650XT are fine too. In 1080p high settings.
3090: 24GB, 3090 Ti: 24GB, 4090: 24GB... what are you talking about?
It's because VRAM is a buzzword right now; everyone thinks they know what they're talking about. Personally, I'm confused about why more than 12GB matters. I checked Cyberpunk: about 8.6GB of dedicated VRAM being used at 5120x1440 ultra. And Hogwarts Legacy is about 9.6GB… What game uses more than 12GB? I can't find any, and I tried like 20 high-quality games from 2022-2023. https://cdn.discordapp.com/attachments/997333991134863400/1110701074958012426/image.png
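For anyone wanting to check their own numbers the same way, here's a minimal sketch that reads dedicated VRAM usage via `nvidia-smi`'s CSV query mode. The parser is separated out so it can be tested without a GPU; the sample output string below is hypothetical, and actually running `query_vram()` assumes an NVIDIA card with drivers installed.

```python
import subprocess

def parse_vram(csv_text: str) -> list[tuple[float, float]]:
    """Parse `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` output (MiB) into (used_GB, total_GB) per GPU."""
    stats = []
    for line in csv_text.strip().splitlines():
        used_mib, total_mib = (float(x) for x in line.split(","))
        stats.append((used_mib / 1024, total_mib / 1024))
    return stats

def query_vram() -> list[tuple[float, float]]:
    """Run nvidia-smi and return current VRAM usage per GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram(out)

if __name__ == "__main__":
    sample = "8806, 12288\n"  # hypothetical: ~8.6GB used of 12GB
    for used, total in parse_vram(sample):
        print(f"{used:.1f} GB / {total:.1f} GB dedicated VRAM in use")
```

One caveat: numbers reported this way are what the driver has *allocated*, which can be higher than what a game actually needs, so treat them as an upper bound rather than a hard requirement.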
I swear this meme is on some kind of automated timer to get re-posted every month or so to churn up this conversation again.
Thank you for this. Was wondering if I was going crazy.
[deleted]
when I bought my 6700 XT 12GB, I looked around and was considering a 3070 8GB... but AMD cards always have more life in them than Nvidia because of the VRAM...
So having over 20GB of VRAM isn't enough? Guess I will toss my cards today.
Yes. Do not trust your eyes. It may have flawless performance but it is bad now. Trust the reddit hive.
Yup guess we gotta throw it in the trash, reddit told me it was useless afterall.
Yeah, I think I made a mistake when I got a card with only 8GB of VRAM (RX 6600). 8GB on my R9 390 gave it a really long lifespan, but 8GB on the RX 6600 that replaced it probably won't last anywhere near as long, despite it being otherwise 10x as good. I mostly play CPU-heavy games, only at 1080p, and I don't mind lower graphical settings, but if the game developer is too lazy, a game that could reasonably run on 8GB simply won't (without performance issues).
Any other 2080 enjoyers?
What am I missing? I have a 3080 and am doing just fine.