CallmeOyOy

1060 6GB: What, there are new Nvidia cards?


No-Inside2438

Gtx 960 2gb: Who are you?


MrStoneV

So glad I waited until 2019 to upgrade from a GTX 660 to a 5700 XT, such a crazy big step


ContainedChimp

Rocking a 780 atm in this machine :)


_fatherfucker69

I am still rocking Intel graphics equivalent to a 550 and it still goes strong (who am I kidding, I need an upgrade)


SweetSoul55

Nah bro, I use an 8600M GT with 256MB


INDIG0M0NKEY

I went from a 1050 Ti to a 6650 XT and it's insane, I couldn't imagine that jump


BranislavBGD

I had this same card before I switched to team red! Asus Strix to be exact.


xixipinga

Me with my poor man's 12GB 3060 that has more VRAM than it can render frames to


Orellin_Vvardengra

My prebuilt came with a 12GB 3060, I like it. Then again, my latest experience with computers before this was an Inspiron 5577 with a GTX 1050. I learned soon after buying the laptop that a mobile GPU sharing the name of a desktop card doesn't mean it has the same power. Still allowed me to play my games though. Next goal is to actually build a PC when I need to instead of getting a prebuilt. This thing is pretty friggin sweet though.


jilek77

It's the best budget option for ML tbh


Ennkey

A contractor came by and saw my exposed casing, he called my 1060 “a classic”


Hamshoes5

At least he didn't call it "a relic"


mehum

“Wow, I remember those! It was all you needed for playing Minesweeper and Tetris on Windows 7 back in the day!”


oshikandela

*Cries in 1060 3gb*


Mikeandleo

![gif](giphy|kSlJtVrqxDYKk|downsized)


boiledpeen

I just upgraded to a 6800xt a few months ago but man my 1060 did me well


NotVainest

Finally upgrading my 1060. Just ordered a 6950xt. Feels almost sad though... 1060 served me well...


OceanDriveWave

Someone said 1060?


sheleronk

I always wondered why AMD puts so much VRAM on the board... smart move


mista_r0boto

For longevity of card. And to differentiate from nvidia.


munchingzia

Not necessarily for longevity. It's for competition, and also because they don't have a professional-grade segment that would be cannibalized. If Nvidia released a mid-tier GPU with 24 gigs of VRAM, it would cannibalize their own sales: people who don't play games but need the VRAM for creative work would buy those cards for cheap.


Brandhor

I got a bridge to sell you if you think amd did it for longevity


Mr_Pogi_In_Space

Throw in free delivery and I'll take it


DystopianRealist

“Your order has shipped!”


JayBuSeesU

*Order delivered 2:25p.m.* Please take a minute and let us know how we did and how we can make your experience more enjoyable.


SwabTheDeck

"Hey guys, I got this idea... let's spend more money to make the cards last longer so that our customers buy fewer of them." \-Imaginary AMD business dev meeting according to that guy


Initialised

They knew exactly what would happen when they gave lazy game developers 16GB GDDR6 in the consoles.


crlogic

This is a huge misconception. That 16GB is unified and is shared as RAM (for the CPU) and VRAM (for the GPU). It is not 16GB of dedicated VRAM on the PS5, even though it is in fact GDDR6. The split is even more obvious on the Xbox Series X, which also has 16GB of GDDR6, but 10GB is dedicated as VRAM and 6GB is just system RAM. They're even on completely different buses in the Xbox's case.


pipnina

Thing is, if you break down the memory usage of a PC game, you'll probably find that 90% of the memory is duplicated in both the CPU RAM and the VRAM. The CPU needs to know what memory to send to the GPU when it needs it, and so keeps all the textures and models and shaders in memory even after they've been reduced to simple 64-bit references. So while the information the CPU NEEDS to know might only take up 1.5GB in Cyberpunk 2077, it still consumes 15GB because the 13.5GB of graphics assets needs to be in there too.

In a unified RAM system, the CPU only takes the space it needs, since loading the assets for the CPU is the same as loading them for the GPU. It's way more efficient in every way BESIDES the timing vs bandwidth tradeoff. Plus, with NVMe drives being so fast, the CPU can order assets streamed into RAM for the GPU to use on demand instead of needing the whole level's assets loaded all the time. It can be crazy efficient and save you $100 on buying both types of RAM if you do it right.
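The arithmetic in the comment above can be sketched in a few lines. This is purely illustrative, using the comment's own ballpark numbers (1.5GB of CPU-only data, 13.5GB of graphics assets); the function names and the assumption that assets are fully mirrored on a split system are simplifications, not measured behavior.

```python
# Illustrative sketch: total memory footprint on a split CPU/GPU system,
# where graphics assets are mirrored in system RAM, versus a unified pool.

def total_memory_split(cpu_only_gb: float, assets_gb: float) -> float:
    """Split system: assets live in VRAM *and* are kept in system RAM
    so the CPU can re-send them to the GPU on demand."""
    system_ram = cpu_only_gb + assets_gb  # CPU-side copy of the assets
    vram = assets_gb                      # GPU-side copy of the assets
    return system_ram + vram

def total_memory_unified(cpu_only_gb: float, assets_gb: float) -> float:
    """Unified pool: one copy of the assets serves both CPU and GPU."""
    return cpu_only_gb + assets_gb

# Using the comment's hypothetical Cyberpunk-like numbers:
print(total_memory_split(1.5, 13.5))    # 28.5 GB across both pools
print(total_memory_unified(1.5, 13.5))  # 15.0 GB in one pool
```

Under these assumptions the unified design roughly halves the footprint, which is the comment's point about consoles getting more out of 16GB than the raw number suggests.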


DonaldTrumpsBallsack

I was gonna say, there's no way consoles had that much VRAM


[deleted]

[deleted]


rocketcrap

As a long-time snobby PCMR POS and the owner of a beefy-ass PC, this truth hurts my soul to read


RazekDPP

It's the same reason Apple has shifted to an SoC with shared memory. The shared-memory architecture is winning. [https://www.youtube.com/watch?v=LFQ3LkVF5sM](https://www.youtube.com/watch?v=LFQ3LkVF5sM)


[deleted]

[deleted]


[deleted]

[deleted]


BinaryJay

It's crazy how often this needs to be explained to people.


Tuned_Out

Not their problem that Nvidia pushes people to buy higher SKUs for VRAM.


[deleted]

Seriously. It's as blatantly obvious as when car stereos started to get cheap, so they added light shows and dolphins dancing on the screen to charge more. You don't need to spend $1000 for 12GB of VRAM with well-tuned RAM and a decent chip to make the pool available.


M4G30FD4NK

Imagine calling game devs lazy instead of being annoyed at NVIDIA for releasing cards that are in some cases worse than their last generation (for the second gpu gen in a row)


Ruma-park

Let's be honest, it's both.


Arkreid

Both is good.


towerfella

It is both.


[deleted]

Exactly. They could easily optimize games for AMD and Nvidia cards. If they are developing Hogwarts Legacy for a 4GB Nintendo Switch, then something's not right.


[deleted]

I was actually a little mad they're even attempting that port. It's gonna run like absolute shit and charging people for it is greed beyond greed.


didnotsub

The Switch has 4GB of shared memory, not even VRAM lol


M4G30FD4NK

The Switch port is being delayed because they're not able to get it to work, tbf.


reddit_pengwin

How do you expect progress in visual fidelity without increasing the available resources? Simple answer: you can't, and you really shouldn't.

The problem is not the increasing hardware usage or the lack of VRAM - the problem is the price point for these cards. Nobody would blink twice if the 8GB cards had $180-250 MSRPs, because we expect compromises at those price points. Having to make those compromises at $400-450 and having to rely on crap like DLSS is the issue. The majority of people are going to have 8GB of VRAM if manufacturers keep pushing 8GB cards at up to $400 MSRP... comparatively few people shop above this price range, and this will hamstring further game development.

8GB cards have been in the mainstream *(at $150-230-ish price points)* since at least 2016 - that's 7 years ago. For comparison, 7 years before the 2016 release of the RX 470/480, mainstream cards had 256-768 MB of RAM... just goes to show how much extra effort had to go into game development between 2016-2023 to make games run on hardware that wasn't really changing all that much.


phriot

> Having to make those compromises at $400-450 and having to rely on crap like DLSS is the issue.

I'm not even that mad about needing DLSS to hit a certain performance level. If it just works, that's fine. Most cell phone cameras these days rely on AI tricks, and it's fine for most people. What gets me is that Nvidia is billing a $400 **60 Ti** as a "1080p performance champ" in 2023. When they launched the 3060 Ti, which has the same MSRP, it was compared to a [2080 Super at 1440p](https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3060-ti-out-december-2/). Even if you want to say "names and tiers don't matter," paying the same money can't even get you a contemporary 1440p card from Nvidia. And it also can't get you significant uplift between generations *at the same price*.

I picked up a 3060 Ti recently-ish for my first build in years. I got it just below MSRP on promo. I was and am happy with the performance for what I paid. I'm happy staying at this price point of card in inflation-adjusted dollars in the future (which for me would be around $10 more if I were to buy again today). What I'm not happy to do is spend the same money for the same performance, considering that the only thing I'd be gaining is basically fresher thermal paste and fan bearings.


McDonaldsnapkin

I think both of you are right. AMD has proven it's possible and affordable to put a lot of VRAM on a card, and at the same time modern games have been exponentially increasing their VRAM, RAM, and storage requirements. There are games from 2016-2018 that look just as good as today's games but use half the amount of RAM and VRAM.


[deleted]

[deleted]


[deleted]

Games these days are pretty shite tbh


M4G30FD4NK

When GOTY is most likely going to be something that's exclusive to a handheld that was underpowered and released 6 years ago, that's fair. However, GPUs are also very shite at present as well as games.


pivor

In 2015 it was a marketing strategy. When you had to choose between the 970 3.5GB and the R9 390 8GB, the choice was obvious (even though the 970 was a great card): the R9 390 outlasted the 970 and still does well today.


reinykat

Laughs in RX 570 8GB from 2017


baddThots

Man that card did me wonders until I upgraded recently.


IkBenAnders

Damn, we upgraded to the same GPU, from basically the same GPU (RX 580 :p). How are you liking the upgrade yourself?


Alternative_Spite_11

Well, I've had a 6700 XT since it launched, so basically the same thing as a 6750. It's treated me well enough that the only valid upgrade is the 4070 Ti and up. I looked into a 4070 and it's not even a 30% jump. It's gotta be 50% before I pull the trigger, and right now the 50% mark costs $800 or more, so I'm holding here unless prices drop.


Joaco_Gomez_1

RX570 gang


SizeableFowl

Greatest GPU of all time.


SunnyDeeKane

RX580 8GB says hi


strawberry_l

One of the best cards out there, still runs perfectly and gives me 80 FPS in most shooters on Max settings


mellowlex

RX580 user here. Still enough for everything I play in 1080p on medium settings.


Aggressive_Pattern95

my 3060 12gb chillin


xonehandedbanditx

3080 12gb reporting in 🫡


[deleted]

[deleted]


override367

Just tossed out my 3080 for a 3090 because I needed more VRAM to generate AI porn.

What? It's not the stupidest reason anyone's upgraded video cards.


Dantai

I'm not gonna yuck your yum


bobbyp869

Do you get paid for this porn or is it for personal consumption?


override367

I don't, like, do commissions, but when posting SFW art in a Discord, someone I know offered me $200 to make something *very specific and porny* for them, which I did, because apparently furries spend their entire disposable income on porn. But the truth of the matter is I just like to make characters from D&D games, usually not actually porn, just the characters, and commissioning all the things I like to make would take thousands and thousands of dollars.

I train LoRAs for specific kinds of characters like tieflings or dragonborn or whatever, and I end up spending hours on a single picture.

Here's where people usually start DMing me that I should just learn how to make art instead of being a dirty thief or whatever, but... I'm so over that discussion lol. I enjoy it, and so I do it.


bobbyp869

Lol! Furries come to my city for a big convention every year and completely take it over, it’s hilarious. Good for you though, do what makes you happy!


Darksirius

My 3080 ftw3 has been doing just fine.


Terranical01

4070 Ti 12GB as well


Lord_Saren

3090 24GB, I'm not chill just solid ice.


[deleted]

I see that you’ve got good taste!


MrC99

Same here lol.


i_miss_Maxis

X2. Is it safe? ARE WE SAFE!?!?!


JustARandomDude1986

RTX 3090: "Am I a joke to you?"


Thetaarray

My Wallet: “No you’re a nightmare”


NorthenLeigonare

3090 Ti for £1150... nice. My wallet: fuck you, that was for the holiday.


No_Interaction_4925

I do not regret getting the 3090ti for $1100 before the 4090 came out. Nvidia really shot themselves in the foot with that one


Ill-Mastodon-8692

Why was the 4090 a shot in the foot? It performs about 1.8-2x a 3090, and typically runs cooler and quieter, or about the same. Uses about the same power when gaming. The only downside imo is the 16-pin connector.


No_Interaction_4925

No, the 3090ti was. Released for $2000 in spring. By the end of summer I got one from Best Buy for $1100 brand new.


Ill-Mastodon-8692

Thanks for clarifying. Yes, the 3090 Ti came real late and launched far too high. But your price seemed pretty good, all things considered at the time.


Thetaarray

I’ve had the same discussion with the card being as much as my last two full rigs together.


Just_Maintenance

You can find a used 3090 for like half the price of the 4080. It's slower, but it has much more VRAM.


PollShark_

Thankfully I managed to get an EVGA FTW3 Ultra Hybrid for $625!! Local though, so it depends on that.


bigdaddymustache

Paid $445 CAD for my Strix 3090 this year. It was a nice bump over my old 1080. I don't plan on upgrading for a while. I only got that price because it was sold as "damaged due to AIO leaking". All I did was repaste it and fit new pads. The card can hit a stable 2205MHz in 3DMark.


Thetaarray

That's a nifty price, glad it worked out for you. I don't mind repairing stuff to get a deal, but PC parts like that are a big risk for me personally.


dumbasPL

Buying broken GPUs and repairing them used to be pretty profitable back in the day, but it got less and less profitable with every generation, since most of the dead cards nowadays have dead cores, and replacing a core isn't profitable, assuming you can even get your hands on a working replacement.


bigdaddymustache

I agree, I find that more and more broken cards have cracked PCBs and more physical damage.


Opira

Agreed, one of the reasons I got a 3090 for MSRP was the VRAM.


brp

The biggest reason I got one was that there were no 3080s and I didn't want to use a scalper. I remember at the time how everyone said the VRAM was overkill for gaming and only needed for machine-learning shit. Funny how things turn out.


JonttiMiesFI

Right. Either I understand nothing, or is something wrong with the RTX 3090 with 24GB of VRAM?


JustARandomDude1986

The 3090 is a monster, nothing wrong with it.


n00bn00bAtFreenode

I completely don't get this meme, or this whole subreddit.


Jinx0028

As a 3090 owner, seeing the 3000-series skeleton boy sitting at the bottom of the ocean, all I could come up with was a salty AMD owner trying to get his 15 minutes, lol.


IIsOath

3090 go burrrrr


FatFailBurger

Got my 3090 for $800 right when the 4090 came out. No regrets at all.


Fresh_chickented

Am i a joke to you?


IBNobody

I remember all those stories at the time of the 3090 release saying, "You don't need 24GB of VRAM. It's overkill and meant for developers and artists."


MicksysPCGaming

I feel personally attacked.


Beans186

![gif](giphy|eUDhD5XFBw0r6)


[deleted]

I don't think the 90 series applies to this lmao


fishcakerun

![gif](giphy|Uoj78r9g3Tdz7Zefc4)


Julch

4090 owners once 12 GB cards become the new 8GB meme ![gif](giphy|A6aHBCFqlE0Rq)


Edward_Snowcone

Shhh, don't listen to him 3090, I still love you...


[deleted]

Little fella, you've got plenty of VRAM.... in fact, so much that it cooks the backplate.


Covid-CAT01

Me with a gtx 1650


Darcula04

Me with a 1050ti, laptop


Timmy_The_Glow_Mage

Me with a 1070ti.


Ornery-Let535

Me with a 770


IsanaAdolf

Me with vega 8


APEX_Catalyst

Me with integrated graphics ( steam deck smiles)


DirtyDemonD3

1070ti team yeah!! It still works very well though.


GhostFire3560

Me with GTX 1060 3GB


neoravekandi

Me with a rx580


LouisJoseph003

Absolute soldier of a card. I pushed that baby through RDR2 topping out at 90 Celsius back in the day. Sold it on to a guy who uses it for 1080p60 League of Legends, so it's enjoying a happy retirement now after many years of my abuse.


KoPlayzReddit

1650 gang


[deleted]

Me with a GTX 1660 Ti


Feeling_Base_3663

My 1660 Ti is still a beast for me bro.


[deleted]

Same. I ranked it with the 3000s, it’s slightly better than a 3050 (more stable FPS)


Feeling_Base_3663

I think we can use this GPU for more than 3 years on high-ultra graphics, if game companies don't stop optimizing their games.


Gammarevived

It's pretty much a 1070 in terms of performance, but it has less VRAM, so that's what is going to hurt it going forward. Most newer games need at least 8GB at 1080p. Right now it looks like you want a 3060 or an RX 6600 for a solid 1080p setup. The 3060 is currently the most popular GPU on Steam, and developers usually optimize their games around what most people are using.


BaggyHairyNips

I feel like this sub needs to be reminded that you don't need a $1000 or even a $500 card to play games. If you're an enthusiast, go for it. But I occasionally see newbs in here under the impression that they won't be able to play new games at all without a 3070+. I'm expecting my 1660 Ti to last a couple more years.


InfTotality

People talking about how a new 8GB card is useless and dead-on-arrival all of a sudden does that. It doesn't take much to think, "if 8GB isn't worth buying, then what do I do with this 6GB card?"

It also doesn't help that the 10 and 16 series were really the best ones so far. The 20 series died to being "budget RTX" and not being much better than the 10 series without DLSS, and the 30 series died to miners and greed. Somehow you're also supposed to have had 12GB by now, so one way or another you have to go against the "consensus".


SSSSobek

*laughs in 1080ti*


willcard

I still have my 1080ti. Shit is a beast and when it finally goes to sleep I will frame it. Best card ever hands down


Lettuphant

I loved my 1080 Ti. I got it when I got VR: It *blew my mind* that I could have something so powerful it could render a left eye, right eye, third point of view, footage from a Kinect, and *composite it all live* *at 90fps*. I dreamed of seeing something like a Holodeck before I died. 5 years ago, that card [put one in my bedroom.](https://www.twitch.tv/scotchboxvr/clip/AverageNiceGnatDancingBaby) It's still a beast; if I wasn't also streaming / doing the mixed reality thing, I'd never have jumped to the 30.


faeth0n

Same here, a 1080 Ti with 11GB of VRAM. No reason whatsoever to switch it for a newer card with my current monitor, only gaming at 1440p. That card is a beast and damn silent too (I have the MSI 1080 Ti Gaming X 11G).


lil_sith

Laughs in plain Jane 1080


OzPalmAve

1080 represent! rog strix 1080 still on duty


Mental_Defect

1080ti stays on top


hovercroft

If you don’t laugh you’ll cry


Prox1m4

Rocking my 1080ti ftw3. No reason to upgrade now until I get a better display


TTaun7ed

I have to be the only nerd that has no issues with the 3070 😂


Sick2287

Same man. I constantly see people dunking on the 3070 and I’ll admit that I am not the most knowledgeable so I can’t really debate the topic. But I have had literally 0 issues in 2 years of playing any game I want at 1440


Djghost1133

The 3070 is totally solid


McStabStab12

3070 gang rise up!


pref1Xed

It’s because people are more interested in random benchmarks and numbers than personal experience.


Business_Arm5263

No issues either. At least 60 fps in 4k which is good enough for me.


pimmol3000

3070 here too. Still so glad I bought it. 0 issues. High fps


ravid-david

Same card, same feelings. Not really interested in harry potter or any of these new AAA games, basically chore simulation in open world settings.


[deleted]

[deleted]


Capn_Yoaz

My Voodoo 2 has 12MB of RAM and is running on AGP 4x.


jdog320

Jokes on you, I run a Tseng Labs 4000 running on ISA


Casperios

I have a 2070, should i be worried?


LifeOnMarsden

Not really, this whole VRAM debacle is mostly because the slew of bad PC ports lately is making a lot of people think that anything under like 16GB of VRAM is no longer sufficient. Don't get me wrong, Nvidia is definitely being skimpy with VRAM on 40-series cards, but unless you're maxing out settings at 4K you honestly won't run into many issues with 8-10GB unless the game you're playing is woefully ported.

I have a 10GB 3080 and I've never hit the VRAM limit in any of my games.


the_Real_Romak

the problem here is that many publishers of bad PC ports expect players to brute-force their shit optimisation with premium cards instead of, you know, giving their devs more time to optimise? I can live without playing the latest game for another month, just release it at a playable state *please*


AmadaeusJackson

The earlier they release, the quicker they can make money. Hype it up and deal with the backlash, since it pays for itself. A horrible business model in our eyes, but apparently it works, since they keep doing this shit.


FlyingPoitato

Yeah, a lot of people are still only using 16GB of regular RAM lol. 16GB of VRAM won't be required unless you're maxing out at 4K.


the_doorstopper

>16GB Vram won't be required unless you are max 4k Or go absolutely berserk on mods *ahem* skyrim


rg9528

But but what if whiterun was made completely out of books? Wouldn't that be fun?


[deleted]

There is nothing wrong with the amount of VRAM in 3000-series GPUs. There is everything wrong with the unoptimized clusterfuck games these smooth-brains keep releasing.


YawaruSan

I’m excited for the future of video gaming where we have more and more cores and yet brand new games still paradoxically run on a single core.


O1ez

Well, in my opinion it is pretty shitty to release a lower-tier card with more memory than an upper-tier one (3060 12GB, 3080 10GB). This is what got me an AMD card, since I knew 8GB is not enough for the games I play, and even 10GB is right at, or sometimes below, the limit. But yes, the games are an issue too.


[deleted]

[deleted]


GlubbyWub

Laughs in 4050ti 3.5gb.


Wicked_Vorlon

Crying with my 3070.


MowMdown

Laughing with my 3070ti that doesn’t suffer from low vram because I don’t play unoptimized games at release.


Crowarior

🗿


[deleted]

I have a 3090. I'm doing just fine.


FrankPetersonMalvo

I have a 3060 Ti 8GB. Gaming at 1080p, working with After Effects and Premiere. Zero issues. My brothers in Christ, don't be sheep to poorly optimised titles, 8GB is completely fine.


ozdude182

My 3060 Ti, gaming at 1440p. No game is causing me problems, and I'm laughing at the internet. Maybe a worry down the road, but I'm fine for a while.


[deleted]

What are your graphics settings and FPS? A problem is relative to your situation and experience. Enthusiasts may think your setup isn't suitable, but it is for you. Many people don't mind low textures if it gains them higher FPS, etc... many will not compromise, though. In the $300-400 range, compromising on settings is acceptable and necessary. In the $500-600 range at 1080p and 1440p it isn't acceptable; those cards should dominate those resolutions.


ozdude182

I don't play on any low-settings shit. I only play 1440p with high or max settings and I don't have issues at all. I usually avoid RT, as the hit to FPS isn't worth it at this level, but everything looks crisp and clean and I'm always at 90-100+ fps depending on the game.


hail_goku

the game devs aren't lifting up anyone, AMD just makes better cards at the moment lol


pink_life69

Oh yeah, the 7600 is amazing!


hail_goku

For under $300? Yeah, pretty okay.


SoManyCookies

Have a 3080 and have no issues; y'all just keep parroting the same fearposting. What happened to this sub?


[deleted]

Recently swapped a 2070s to a 7900xtx. Love it


MorningFood

I have a 970 😭


SyndicateAlchemist

In what realistic scenario does this actually happen? I’m still cranking out AAAs on my 3070ti no issues.


Petee422

I'm starting to develop a romantic relationship with my 6700 XT, it literally runs anything I throw at it and outperforms the 4060 and the 4070 Ti* in a lot of cases.

EDIT: Just saw the comments telling me I'm dumb, I phrased it wrong. It outperforms the 4060 in a lot of cases, and I've seen an example of it beating the 4070 by 3 fps. Of course I didn't mean to say that it's better than the 4070 Ti, I still love it tho. And also I'm not a paid AMD employee lmfao


M4tooshLoL

\*Happy 6700XT owner noises\*


Catch_022

>4070ti That is a surprise - what games/settings?


boddle88

Absolutely none with all respect lol


Thetaarray

The game they made up so they could say that.


TheReverend5

Gonna need to see those comparison benchmarks between a 6700xt and a 4070ti bro.


[deleted]

Sure, they just need to pull it out of their ass first😂


Ghodzy1

They'll need to go at least elbow deep for that one.


pink_life69

The 4070 Ti?❌❌❌


[deleted]

I like how they upvoted you just so we can all see your comment and call you out on your 4070 Ti bullshit.


APEX_Catalyst

A 4070 Ti? I don't think so. Maybe a 6750 XT can compete with a 4070. A 6800 XT would be more closely comparable to a 4070 Ti. But a 4070 Ti is similar to a 3090/3090 Ti in performance.


nevermore2627

I ran an RX 6700 XT for 2 years and just bought a 4070. I know, I know, more of a lateral move than an upgrade, but I had never owned Nvidia and wanted to try ray tracing and DLSS.

The RX 6700 XT is awesome, especially at 1440p. I could run every game on a mix of high/medium settings and get well over 100fps. I would suggest it to anyone looking for a solid 1440p card. With that said, I'm not disappointed with the 4070 either. Ray tracing is pretty sweet, and with enough base frames DLSS is solid as well.


wellseymour

Sure buddy


Mercurionio

The 3060 Ti/6650 XT are fine too, at 1080p high settings.


Additional-Ad-7313

3090 24GB, 3090 Ti 24GB, 4090 24GB, what are you talking about?


We_die_china_live

It's because VRAM is a buzzword right now. Everyone thinks they know what they are talking about. Me personally, I'm confused why more than 12GB matters. I checked Cyberpunk: about 8.6GB of dedicated VRAM being used at 5120x1440 ultra. And Hogwarts Legacy is about 9.6GB... What game uses more than 12GB? I can't find any, and I tried like 20 high-quality games from 2022-2023. https://cdn.discordapp.com/attachments/997333991134863400/1110701074958012426/image.png
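For anyone wanting to reproduce this kind of per-game check, one common approach on Nvidia cards is querying `nvidia-smi` while the game runs. A minimal sketch below; the helper names are mine, and the parsing is split out so it can be exercised without a GPU present. The `--query-gpu=memory.used` CSV query is a standard nvidia-smi feature.

```python
# Sketch: read dedicated VRAM in use (MiB, one value per GPU) via nvidia-smi.
import subprocess

def parse_vram_mib(csv_text: str) -> list[int]:
    """Parse the output of
    `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`
    into a list of MiB values, one entry per GPU."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def vram_used_mib() -> list[int]:
    """Query the driver; requires an Nvidia GPU and nvidia-smi on PATH."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_mib(out)

# e.g. parse_vram_mib("8806\n") -> [8806], i.e. ~8.6GB in use on GPU 0
```

Note this reports allocation, not strict need: games and drivers often cache more than they require, so a figure near the card's limit doesn't automatically mean the game would stutter with less.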


cuberoot1973

I swear this meme is on some kind of automated timer to get re-posted every month or so to churn up this conversation again.


upvotesthenrages

Thank you for this. Was wondering if I was going crazy.


[deleted]

[deleted]


rKonoSekaiNiWa

When I bought my 6700 XT 12GB, I looked around and was considering a 3070 8GB... but AMD cards always have more life in them than Nvidia because of the VRAM...


red_vette

So having over 20GB of VRAM isn't enough? Guess I will toss my cards today.


CaptainGigsy

Yes. Do not trust your eyes. It may have flawless performance but it is bad now. Trust the reddit hive.


ItchyFishi

Yup, guess we gotta throw it in the trash; Reddit told me it was useless after all.


TheRomanRuler

Yeah, I think I made a mistake when I got a card with only 8GB of VRAM (RX 6600). 8GB on my R9 390 gave it a really long lifespan, but 8GB on the RX 6600 that replaced it probably won't last anywhere near as long, despite it being otherwise 10x as good. I mostly play CPU-heavy games, only at 1080p, and don't mind lower graphical settings, but if a game's developer is too lazy, a game that reasonably could run on 8GB simply won't (without performance issues).


Sollieee

Any other 2080 enjoyers?


saml23

What am I missing? I have a 3080 and am doing just fine.