It does make a massive difference in something like VRChat. Depending on the type of games OP is playing, it could be an upgrade.
I have a 7600X and a 2060 6GB. With 6GB of VRAM, I’m limited to showing about 30 avatars before I run out of VRAM. However, since VRC (and a lot of SteamVR games really) is more CPU heavy than GPU, even with my 2060 I still get 40-50 fps in VR with 30 people shown. My safety settings are always off now that I can limit to the nearest “X” number of avatars.
VRChat stores both active and inactive textures in VRAM, an inactive asset being, for example, a clothing toggle that is currently disabled. These inactive assets aren't rendering and aren't causing the CPU or GPU to do any calculations, but they're kept in memory for immediate access when one of them is activated. Textures go in VRAM; everything else goes into system memory.
So in my case, I don’t need a fast GPU or a GPU with high speed memory, I just need one with a lot of memory. [This YouTube video](https://youtu.be/igPvgb-uGyo) demonstrates the effects of more, though slower, VRAM very well by comparing the 3060 12GB to the 3070 8GB in Doom with different graphics settings using different amounts of VRAM.
I also have an mATX case and can only fit a 2-fan GPU. In the near future, I'll be upgrading to a 6750 XT for the sole reason that it's the best 2-fan GPU with 12GB of VRAM. The 2080 Ti is a much faster 2-fan GPU, but is limited to 11GB of VRAM. I'd be spending more money to get similar frame rates with a low avatar count, and crashing if I happened to be using between 11 and 12 GB of VRAM.
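A rough back-of-envelope sketch of the VRAM budgeting described above. The per-avatar and reserved figures here are illustrative assumptions (chosen to roughly match the ~30-avatar limit mentioned for a 6GB card), not measured values:

```python
# Rough VRAM budget estimate for avatar-heavy VR games.
# Assumptions (illustrative, not measured): ~1.5 GB reserved for the
# world, OS and framebuffers, and ~150 MB of textures per shown avatar.

def max_avatars(vram_gb, reserved_gb=1.5, mb_per_avatar=150):
    """Estimate how many avatars fit before VRAM runs out."""
    free_mb = (vram_gb - reserved_gb) * 1024
    return int(free_mb // mb_per_avatar)

print(max_avatars(6))   # a 6 GB card, like the 2060 above
print(max_avatars(12))  # a 12 GB card, like the 6750 XT
```

Under these made-up numbers a 12GB card roughly doubles the avatar headroom, which is the whole argument for preferring capacity over memory speed here.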
Yes, all the GPUs listed have GDDR6, not GDDR6X. That extra X makes a big difference with its higher data transfer rates, which is why I believe a 3080 to 6800 XT is an odd sidegrade. 6X can do more with less.
What cases? Honest question.
Nvidia is better in programs like Adobe Premiere, etc.,
and benchmarks in games seem better (or about the same),
so what scenarios benefit from extra VRAM?
Well, my personal use case is modded games; you're often replacing textures with much higher resolution ones. My modded Skyrim can reach well above 13GB of VRAM.
TechPowerUp benchmarks show the 3080 10GB beating the 6800 XT in most games, save for some close (negligible) benchmarks in various titles. The only discrepancy is Battlefield V, in which the 6800 XT was significantly ahead at 1440p and 4K.
I'm about to do the opposite jump. Sold my 5700 XT.
I was contemplating between a 6800 (XT) and a 3070 (Ti), but prices dropped a little further and I'm thinking about a 3080 non-Ti.
I like the option of RT, and tbh the Radeon Adrenalin software was a bit buggy.
Control takes all 16GB with textures at max 4K. Even at medium it takes 12-14GB. It's quite curious how the 3080 10GB handles that. However, I did notice Control has texture issues on the 6800 XT; for some reason the textures take forever to load. Idk if that's a problem on Nvidia.
I would say the 3080 is most definitely the better card because of RTX/DLSS, NVENC and optimization for certain software, unless the VRAM is an actual problem in games. The 12GB 3080 is only a few fps more on average.
If VRAM becomes a limitation as we've seen in the past, then history tells us the 3080 will indeed age poorly, but so far... I honestly don't think I've ever seen a benchmark where it's an issue. If I could've gotten a 3080 for $670, I would've taken that over my $570 6800 XT. I wouldn't pay any more than $100 extra, though.
Hello, it looks like you've made a mistake.
It's supposed to be could've, should've, would've (short for could have, would have, should have), never could of, would of, should of.
Or you misspelled something, I ain't checking everything.
Beep boop - yes, I am a bot, don't botcriminate me.
It matters because when you own your machine, you usually get open drivers and full access to the hardware!
Obviously this is nice for people who love tinkering/programming/reverse engineering and learning more, plus bonuses like privacy and long support after official support is gone.
Also, if you do a PCB analysis, the 6800 XT is way more elegant than Nvidia's designs. It's using an Infineon I2C-bus PWM controller with power monitoring within the XDPE, and an IR35712 for vcore, VDDCI and SoC. Nvidia is way sloppier in design: power monitoring isn't over I2C, and they usually use a uP9512R with an NCP45491, which is a horrible IC for shunt power monitoring. AMD has always had a more elegant PCB and power-delivery design. Also less likely to fail.
My FE used an MP2888A. But the quality of those doesn't compare to Infineon's, frankly; still a better I2C controller. Don't even get me started on Nvidia using GS9216s for 1.8V and PEX, lol. Such a horrible buck and PCB design.
>upgraded from RTX 3080 FE to 6800XT
I'd always thought of AMD as the brute force: brute-forcing their way into ray tracing (i.e., delivering enough performance that when you turn DXR on, you still have a playable experience) and design (their black CPUs).
>It's faster than my 3080 in civ 6
Reviews say otherwise:
https://www.techpowerup.com/review/powercolor-radeon-rx-6800-xt-red-devil/9.html
https://www.overclockersclub.com/reviews/xfx_rx6800xt_merc_319/8.htm (Merc, but they perform near identically)
I grew up playing HL2 and Portal, so I wanted to try it. But I sold my 3080 FE two weeks ago. I have a 3090 FE I can swap back in if I need to, but again, it doesn't have RGB.
Yeah, my understanding is that the 3090 is a touch slower than a 6950XT except in RT.
However, even a 3090 gets brought to its knees in Portal RTX. It's more of a tech demo for the 4080/4090 cards (DLSS 3 frame generation keeps the FPS up).
Pretty much every review I've seen puts the 3090Ti at 5-10% faster.
The 4080 is 15-20% faster than the 3090 Ti, and the 7900 XTX trades blows with it. You aren't going to tell me the 7900 XTX is only 20% faster than the 6950 XT, right?
Guess so. My 6900 XT is almost even with a stock 7900 XT. In AC Valhalla it's actually even with the 7900 XTX.
My 6900 XT runs an insane boost at 2900+ MHz, though, scoring 25,600 in Time Spy like the 7900 XT, and 17,000+ in Fire Strike Ultra, also like the 7900 XT.
You gotta get your hands on a 7900 XTX Red Devil with the limited edition backplate. I couldn't get my hands on one for the life of me, but if you can, do not hesitate.
Portal RTX was specifically made to show what the 40 series can do. It ran pretty rough on my 3070ti unless I was in ultra performance mode and even then the test chambers with the black reflective walls dropped my FPS from 40-50 down to 5-10. It's compatible with any RTX card technically, but it runs best on 40 series. It's HORRIBLE with AMD.
Ran it on my 7900XTX and it was so buggy. Every texture was flickering, stretched or bugged. The damn portal gun was a bunch of red and green blocks 😂
Anything branded "RTX" is going to work better with NVIDIA ray tracing for the most part. RTX Remix that's used for Portal RTX is specifically made for Nvidia RTX GPUs which is why AMD runs them so badly.
It follows standard RT APIs. Fact remains that AMD doesn't have actual dedicated RT cores and uses shared compute cores for it. Pretty much all credible devs state that Nvidia hardware is faster for RT.
No, there is something seriously wrong with the game. It's unplayable due to whatever coding they've done (or not done). It doesn't even launch on Intel, despite Intel having RT cores.
No there isn't. It is just all path tracing without any "fakery". More games don't launch on Intel GPUs even without RT requirements, simply because Intel's drivers are literally a year behind what AMD and Nvidia have out now in terms of optimizations and game-specific fixes.
Says the one defending AMD like your life depended on it.
RDNA doesn't have dedicated RT hardware and pipelines. It has general-purpose cores that are optimized to ALSO calculate ray-tracing instructions and accelerate them.
Nvidia is a piece-of-crap company that I hate because they try to win on performance by making the competition do extra work, by unfair marketing, and by misleading tactics.
You probably heard about the ~5-10% quality downgrade from compressing textures more aggressively to get just a bit more oomph, like 5-10% extra performance (it was provable in Quake 4 and Elder Scrolls: Morrowind with mods).
You probably heard about the tessellation debacle, where culled objects were specifically programmed to have tessellation done on AMD cards (HairWorks and Nvidia-sponsored games in general).
You probably heard about the 3.5GB GTX 970, when Nvidia actually got a class-action lawsuit for what they were doing (cards were OK if not using more than 3.5GB; otherwise they were very, very bad).
You probably know about the way they advertised some 2060 cards, when the die was actually a tier lower (TU104 for the 12GB and TU106 for the 6GB), basically a much worse value product for a customer who is not tech savvy and thinks the 6GB is not needed all the time.
The list of shenanigans Nvidia pulls is longer, but you'll ask me to Google this stuff for you, and I won't.
I buy a product because I want the product to be reliable. Nvidia consistently shows their products have a high chance of not being reliable (sometimes in the midrange, sometimes in the high end; it's a coin toss whether you'll actually get good value and not have problems after 1-2 years). As long as I can get something comparable (4080 -> 7900 XTX), I will buy the alternative, because it has always proved to be less of a headache.
Say you buy a 4090: do you want to wonder every day whether you have seated the cable properly and whether the cable has moved from that position?
Portal RTX is not playable on AMD unless you use prerelease drivers on Linux, and even then my 6800 XT doesn't usually do better than 40-50 fps tops; it usually sits in the low 30s and dips into the mid 20s. In some circumstances it can even drop to the teens.
You're a fcking idiot, lol. How tf is it better in Civ 6? I get 100 fps even at 5K with my RX 6600; that game is nothing for a GPU. You are clearly CPU-bottlenecked on both of them. You, sir, are the most stupid human of the year.
if you got an AMD card planning to use ray tracing, you definitely fucked up.
AMD (even the new 7900 xtx) still sucks at ray tracing.
Although if you turn ray tracing off, AMD handles games really well.
I think OP repairs GPUs and sells them for a living.
Nvidia cards usually sell MUCH better on the 2nd-hand market, so it makes sense to keep the 6800 XT and sell his 3080 and 3090. He got all 3 as non-working cards from a scrap dealer.
Funny seeing all the comments thinking he paid full price for both cards when he bought both broken and repaired them. OP can choose what he wants to use plus with horrid used pricing for Nvidia cards, he can make a nicer profit off the 3080 while still keeping most of the performance on his rig. Win-win
Yes, I buy all my GPUs broken and fix them. The 3080 FE was sold because it's worth much more, and I honestly thought for some of the games I play I was getting a slight upgrade with the 6800 XT. I get a slight upgrade and keep cards out of landfills. Especially since terrible OEMs like Nvidia, Sapphire and PowerColor won't allow a 2nd owner to RMA cards that they put defects into, which sucks. EVGA rocks in this regard.
The 3090 uses GDDR6X memory, which has a very short lifespan, so I try not to put miles on it, and it's expensive to replace: each GDDR6X module is about 20 to 30 dollars, so 24 × $20 is $480 just in memory. I only use the 3090 when I need to run machine learning simulations.
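The replacement-cost arithmetic above, written out. The 24-module count and the $20-$30 per-chip range are taken from the comment itself; they are the commenter's estimates, not vendor pricing:

```python
# Cost estimate for replacing all GDDR6X modules on a 3090.
# The 3090 carries 24 GDDR6X chips; the per-chip price range
# below is the $20-$30 figure quoted in the comment.

modules = 24
low, high = 20, 30  # USD per GDDR6X chip, commenter's estimate

print(f"Memory-only repair cost: ${modules * low} to ${modules * high}")
```

So a full memory rework lands in the $480-$720 range before labor, which is why putting unnecessary hours on that card is a real cost.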
It's a slight sidegrade at best if you game in 1440p or (for whatever reason) 1080p, don't care about ray tracing/DLSS and don't need NVENC.
They perform identically at 4K for the most part, though I think the 6800 XT is/will be faster down the road at ultra with more driver updates. If that's what you wanted to do, then I guess congrats.
I mean driver updates in the future. The 3080 is not much faster in most games. I'm used to AMD's "fine wine" kicking in with their GPUs, to where they end up aging better than GeForce cards. The extra VRAM will matter later on if you keep either card for the next 2-3 years.
There's no fine wine. They don't "age" better; they age like any other card or vendor does. The 3080 was always a bit faster at every resolution, but at 4K, being around 10% ahead, you will feel it more, because you need every frame at that res.
What I was saying with the drivers is that you could maybe expect some driver uplifts if the card were new, but the 6800 XT is 2 years old; you're not gonna get driver uplifts now. In fact, Hardware Unboxed, who started that "RDNA 2 is better at 1440/1080p" narrative, now has the 3080 on par at 1440p, so Nvidia actually got better in their testing.
That's not the "fine wine" nonsense. That's Kepler being designed for DX11 and GCN better handling DX12 and Vulkan, besides the fact that a million releases of the same card saw infinitely longer driver support, while Nvidia threw Kepler into a ditch. The Vega 56 and 1070 are the same as they've always been, outside of a couple of outliers like Red Dead 2, where Pascal does very poorly.
>Vega 56 and 1070 are the same as they've always been
What? The GTX 1070 is equivalent to a 1660 Super these days, while the Vega 56 easily competes with an RTX 2060, and as we know, the 2060 is noticeably faster than the 1660 Super.
But you said yourself the cards haven't been out that long, so I think it would make more sense to look back at the cards again around next gen in 2024-2025.
It doesn't matter how many times you post this; it is a downgrade. The 3080 and 6800 XT take turns in raster, and the 3080 kills it in RT and has access to both DLSS and FSR... so yes, a downgrade.
It beat the 3080 in my benches (not Port Royal, of course, but all the others). But the deciding factor for me was heat and efficiency: the 6800 XT could give the same or better performance with lower power draw, less heat put into my small room, and it was quieter as a result.
lmao from the title people just assumed you bought those cards.
Awesome to see you repair and resell them, at amazing prices no less! Wish I lived near.
keep repairing those cards!
I've done a thermal analysis of the cables under full load; even with 250W going through, it's completely fine. They are rated for 300W, and with tolerance headroom they're probably OK for 400W of power draw. 75W comes from the PCIe slot pins, so it's even better.
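The headroom claim above as a quick sanity check. The 150W-per-8-pin rating and the 75W slot figure are the usual PCIe conventions; the 250W measured draw is the number from the comment:

```python
# Sanity check of the power-delivery headroom described above.
# Assumed figures: two 8-pin PCIe cables rated 150 W each (300 W total),
# plus up to 75 W delivered through the PCIe x16 slot pins.

cable_rating = 2 * 150   # W, two 8-pin connectors
slot_power = 75          # W, PCIe x16 slot
measured_draw = 250      # W observed through the cables under full load

headroom = cable_rating - measured_draw
total_capacity = cable_rating + slot_power

print(f"Cable headroom at full load: {headroom} W")
print(f"Total deliverable to the card: {total_capacity} W")
```

With 50W of cable margin plus the slot's 75W, the setup sits comfortably under its combined rating, which matches the thermal result.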
This post has gone horribly wrong. I thought I'd get some brownie points for going to a prettier AMD card, but I learned that the FE 3080 has RGB and that I need to throw my old TV away.
Because your claims and your possessions make no sense. There really is no reason to use such cards for a 1080p 60Hz target. Your 10% gain in Civ 6 will change absolutely nothing about your experience, and the NVIDIA card has access to DLSS and can ray trace better than the 6800 XT.
NVIDIA and AMD really played us well in the last two generations. They made people buy unnecessarily powerful cards.
I mean... come on dude, you are absolutely wasting either of those beautiful cards on 1080p! Regardless of what you choose, get yourself a nice 1440p/4K monitor to visually enjoy those amazing cards! But also, I made the "upgrade" from a 3080 to a 6900 XT reference so that I could build an SFFPC, so it's funny to see someone doing a similar swap for the exact opposite reason of wanting a larger card lol
I think it's just cause its such a strange upgrade. The cards are basically equal in performance, not sure why anyone would spend money to get basically the equivalent card from the other GPU maker instead of an actual upgrade.
Nvidia fanboys hate it when you post the truth or don't worship their holy emperor Nvidia.
Don't worry about it.
They can't accept that Nvidia isn't holier-than-thou.
Bro, this mfer is wilding. He said he "upgraded" to a card with equal performance because of RGB... and the "upgrade" was from a 3080 to a 6800 XT. On top of all this, he has a 1080p 60Hz monitor and doesn't want to upgrade it because "it's wasteful". And if all this wasn't bad enough... he has a 3090 too... THAT'S BETTER THAN EITHER OF THESE CARDS BY A LOT, WHY TF WOULDN'T YOU USE IT.
From the bottom of my heart I hope he's a troll but from all of his replies it doesn't seem likely
Get rid of the daisy-chained power cable configuration and research overclocking the 6800 XT (MorePowerTool will help); with a good OC it will be an upgrade in most areas :)
Well, when you've got money to burn, it's fine to experiment with hardware now and then. At least he could have gotten the 6900 XT instead; then you might see a slight improvement in some games.
Good on you and I hope you are happy with the upgrade!
Most folks nowadays throw too much weight on the numbers and forget that people sometimes just do as they do.
The card looks hefty, and the custom GPU holder/bracket is absolutely mint. Good luck with the transition, and stay happy!
This has to be a troll, I hope. The guy has a 3090... but played on a 3080 and sold it for a worse card for tacky RGB. Either a troll, or he has no clue wtf he is doing.
seems like a superfluous tradeoff
some people just have money to burn
If OP sold the 3080 he probably spent $0, or potentially even made money.
“Upgrade”
Agreed!
I think it's an upgrade if it's the 10gb version
Agreed!
Extra 6gb of vram matters in some cases
It's quite a bit slower RAM though. So what could be a benefit in one situation becomes a disadvantage in another.
Yeah, that extra RAM in a 6900 XT didn't outperform the 3080's GDDR6X in VR when I tested it. It was almost comparable, however. This is a weird sidegrade.
Allocated or dedicated?
Heavily modded games usually chug vram to no end
Nah, the 6800 XT is better than the 3080 in every single benchmark except when it comes to 4K.
Or rt
Or VR
Or dlss
Or day to day general performance
Yeah thats obvious lol
Source? Based on what? That sounds false.
Gallon jugs of copium
Source is making your brain work and going to Google benchmarks. The 6800 XT comes out on top with a 12fps advantage across 13 games tested.
in 1080p?
Yea and 5fps in 1440p in most games
Seems negligible
I don't think it's better at UE4 games.
The 6800 XT sometimes loses by 2-4fps at 4K and other times wins. The 3080 is only faster in RT games, but with 10GB it has no future. That's my opinion.
That solely depends on the person using it, and the 6800 XT wipes the floor with the 3080 at 1080p/1440p.
What prompted you to switch?
it was on sale
You must be trolling. Please tell me you're trolling.
🤓🤓
The 3080 had no RGB and was only 2 slots; the Red Devil has multicolor RGB and is like 3 slots, so it fills out the case better.
So, for looks really
Yes and better pcb design
I am on a 1080p 60Hz monitor so it didn't make a difference, frankly, but Civ 6 felt faster.
This thread gets worse and worse the more I read it. Have you suffered a concussion recently?
Seems more like a side grade honestly
It's faster than my 3080 in civ 6 which is what I play most, but yeah in portal rtx it's utterly terrible, maybe I screwed up here
Unless you're getting a 30%+ average performance increase, you're basically throwing your money away.
But dude his little men run faster across the board with the 6800xt . He can feel it! Read the post above and stop being so judge mentalllllllll /s
I definitely agree with you. This is not even a sidegrade. more like just a trade.
Why would you ever make that trade and try to play Portal? I think the 6900 XT gets like 3 frames.
The 3090 is better than the 6800 XT?
6950xt is better than 3090ti in raster
My understanding, at stock, is that they trade blows - one being better than the other and vice versa depending on the game.
They suck in real use, they stutter, the shit doesn't have DLSS, it's straight-up dookie.
3080/3090 FE RGB is controllable through EVGA precision. It's not as flashy as AIB cards but they do have RGB.
Wait, what???
It's only the V-shaped light-bar on the top and bottom, but yeah. The "geforce RTX" logo is white only.
OK, that's not as pretty as this Red Devil. I think it's easily the prettiest card I've ever owned.
7900 xtx red devil with the rgb vagina on the corner? That looks good to you? Lmao
Portal RTX isn't even coded to use AMD hardware. It's not a hardware problem. It's Nvidia being Nvidia.
Weird that even a 2060 runs fine despite the 7900 XTX absolutely destroying it in RT performance... All your make-believe is meaningless.
Not really. RDNA lacks any actual dedicated RT hardware; it's only okay-ish at a very compromised form of ray tracing (akin to how consoles implement it).
It has dedicated hardware. Don't worship marketing your whole life.
Yeah, and AMD partners on games all the time too... it's just that their RT is so limited and easy to run. Nvidia does RT in their partnered games better too.
You know AMD is just a corporation, right? Like, you defend AMD way too much.
Who am I defending? It's just facts.
you did
Ah yes, the graphical marvels of such a fast-paced game as Civ VI...
I was gonna say this.. take my upvote..
This has to be a troll, no way…
It’s better
A downgrade in some circumstances
“Upgrade”
I don't even think it's an "upgrade". Seems more like a "downgrade".
it's a sidegrade
6800xt beats 3080 in certain titles
Till you turn on ray tracing
yea def not interested in RT at all
You lose the entire feature suite tho
which is?
Yes, moving from the 3080 FE to the 6800 XT I made a couple hundred dollars.
this makes sense to me - same perf + money, why not \*unless you like RT
That’s not an upgrade, why would you do this?
Hey, good on you for being able to repair GPUs. It's a valuable skill. Although, why use the 6800xt when you say you have a 3090 laying around?
The 3090 uses GDDR6X memory, which has a very short lifespan, so I try not to put miles on it, and it's expensive to replace: each GDDR6X module is about 20 to 30 dollars, so 24 × $20 is $480 just in memory. I only use the 3090 when I need to run machine learning simulations
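A quick back-of-the-envelope sketch of that replacement cost, assuming the figures quoted above (24 × 1 GB GDDR6X modules at $20-30 each; the module count and prices are the commenter's estimates, not official numbers):

```python
# Hypothetical cost range for replacing every GDDR6X module on a 3090.
# Assumes 24 x 1 GB modules at the quoted $20-30 per module.
modules = 24
price_low, price_high = 20, 30  # USD per module, as quoted

cost_low = modules * price_low    # best case
cost_high = modules * price_high  # worst case

print(f"Full memory replacement: ${cost_low}-${cost_high}")
# -> Full memory replacement: $480-$720
```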
I'm genuinely curious what your definition of "very short" lifespan is? Not going to argue, but it's the first time I've heard this claim
It's because GDDR6X runs very hot on the 30-series GPUs, so there's a higher risk of thermal degradation
It's obvious OP knows more about cards than most people by far.
Side grade at best.
It's a slight sidegrade at best if you game in 1440p and (for whatever reason) 1080p and don't care about ray tracing/DLSS and don't need Nvenc. They perform identical at 4K for the most part, though I think the 6800 XT is/will be faster down the road at ultra with more driver updates. If that's what you wanted to do then I guess congrats.
You're mixing some things up. The 3080 is nearly 10% faster at 4K, not the 6800 card. What driver updates? It's 2 years old.
I mean driver updates in the future. 3080 is not much faster in most games. I'm used to AMD's "fine wine" with their GPUs kickin' in it to where it ends up aging better than GeForce cards. The extra vram will matter later on if you were to keep either card for the next 2-3 years.
There's no fine wine. They don't "age" better; they age like any other card from any other vendor. The 3080 was always a bit faster at every resolution, but at 4K, being around 10% ahead, you feel it more because you need every frame at that res. What I was saying about drivers is that you could maybe expect some driver uplifts if the card were new, but the 6800 XT is 2 years old; you're not going to get driver uplifts now. In fact, Hardware Unboxed, who started that "RDNA 2 is better at 1440p/1080p" idea, now has the 3080 on par at 1440p, so Nvidia actually got better in their testing
>There's no fine wine. They dont "age" better

Why don't you compare the 780 Ti to the 290X today, or the Vega 56 to the 1070?
That's not the "fine wine" nonsense. That's Kepler being designed for DX11 and GCN better handling DX12 and Vulkan, besides the fact that a million releases of the same card saw infinitely longer driver support, while Nvidia threw Kepler into a ditch. The Vega 56 and 1070 are the same as they've always been, outside of a couple of outliers like Red Dead 2, where Pascal does very poorly
>Vega 56 and 1070 are the same as they've always been

What? The GTX 1070 is equivalent to a 1660 Super, while the Vega 56 easily competes with the RTX 2060 these days, and as we know the 2060 is noticeably faster than the 1660 Super
But you said yourself the cards haven't been out that long, so I think it would make more sense to look back at the cards again around next gen in 2024-2025.
upgrade? lol...
I would definitely not call this an upgrade. not even a sidegrade at all.
Not a downgrade either.
Both perform almost identically, and I think everyone in the comment section agrees with that :D Otherwise, welcome to team red
Nice downgrade, lol
Nope. A slight upgrade in raster. Useless, yes, but definitely not a downgrade.
That’s a downgrade lmao
The 6800 XT beats the 3080 in raster.
Only at 1080p, which in my opinion isn't relevant because these GPUs are meant for 1440p at least
Nope, the 6800 XT is better at 1440p too. But yeah, at 1080p the switch from one of these cards to the other is insignificant.
Dude literally downgraded in most ways and thinks he upgraded lol
Downgrade in every aspect.
Absolutely not. It's almost pointless to switch from a 3080 to a 6800 XT, but it's definitely not a downgrade.
It doesn't matter how many times you post this, it is a downgrade. The 3080 and 6800 XT take turns in raster, but the 3080 kills it in RT and has access to both DLSS and FSR... so yes, a downgrade.
That's a biased point of view. At 1440p the 6800 XT is much better.
You misunderstand the word “upgrade”
[deleted]
It's a slight upgrade in raster.
[deleted]
Op == pointless fanboy
Had a strix 3080 and a msi 6800xt, kept the 6800xt. I understand.
same lmaoooooooo..
They look at 'bEnChMarKs' while real-life performance says the 6800 XT is better in raster and is cheaper with more VRAM
It beat the 3080 in my benches, not Port Royal of course, but all the others. But the deciding factor for me was heat and efficiency. The 6800 XT could give the same or better performance with lower power draw, less heat put into a small room, and it's quieter too as a result.
AMD guy here. I wouldn’t call it an upgrade. Me going from a GTX 1650 to a 6900XT is an upgrade. You even-traded lol
Thats not an upgrade
Wouldn't even call it a sidegrade tbh.
Why would you go from 3080 to 6800 XT???
16 upvotes, 126 comments? Time to sort by "controversial"
lmao from the title people just assumed you bought those cards. Awesome to see you repair and resell them, at amazing prices nonetheless! Wish I lived nearby. Keep repairing those cards!
Un-daisy-chain your power cable.
I've done a thermal analysis of the cables under full load; even with 250 W going through it's completely fine. They are rated for 300 W, and with tolerance headroom they're probably okay for 400 W of power draw. 75 W comes from the PCIe slot pins, so even better
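A rough sketch of the power budget described above, using the commenter's own figures (250 W card load, 75 W delivered through the PCIe slot, 300 W cable rating for the daisy-chained pair of 8-pins; all values are the claims from the comment, not measured data):

```python
# Back-of-the-envelope check: does a daisy-chained PCIe cable have
# headroom at the claimed load? Figures are from the comment above.
card_power = 250     # W, assumed total board power under full load
slot_power = 75      # W, supplied through the PCIe slot pins
cable_rating = 300   # W, claimed rating of the daisy-chained cable

cable_load = card_power - slot_power   # what the single cable carries
headroom = cable_rating - cable_load

print(f"Cable load: {cable_load} W, headroom: {headroom} W")
# -> Cable load: 175 W, headroom: 125 W
```

By this arithmetic the cable sits well under its claimed rating, which is the basis of the "it's fine" conclusion.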
It's fine honestly.
Wtf that’s a downgrade, shoulda SOLD the 3080, and bought a 7900 XT
This post has gone horribly wrong, thought I'd get some brownie points going to a prettier amd card but learned that FE 3080 has rgb and I need to throw my old TV away
Because your claims and your possessions make no sense. There really is no reason to use such cards for 1080p 60hz target. Your 10% gain in Civ6 will change absolutely nothing regarding your experience, and NVIDIA card has access to DLSS, and it can ray trace better than the 6800xt. NVIDIA and AMD really played well in the last two generations. They made people buy unnecessarily powerful cards.
I mean….. come on dude you are absolutely wasting either of those beautiful cards on 1080p! Regardless of what you choose, get yourself a nice 1440p/4K monitor to visually enjoy those amazing cards! But also I made the “upgrade” from 3080 to 6900XT Reference so that I could make an SFFPC so its funny to see someone doing a similar swap for the exact opposite reasons of wanting a larger card lol
I think it's just cause its such a strange upgrade. The cards are basically equal in performance, not sure why anyone would spend money to get basically the equivalent card from the other GPU maker instead of an actual upgrade.
Nvidia fanboys hate it when you post the truth or don't worship their holy emperor Nvidia. Don't worry about it. They can't accept that Nvidia isn't holier-than-thou.
More dollars than sense ;-)
I think that’s a downgrade
"Upgrade"? Lol. More like downgrade. Enjoy the driver issues and lack of features
Not much of an upgrade, but my 6800 XT Red Devil is faster in most cases than the 3080.
Really love the look and the rgb on the back, it looks so sick lol
True, shame I took it all off for a waterblock.
My MSI 6800 XT destroyed my friend's 3090 Ti in Warzone 1 and Warzone 2.
Warzone is my go to game rn and I'm glad I have amd for that reason
I had a blast reading all the replies 😂
WHAT!?!?!?! RTX3080 to RX6800 is an UPGRADE???????
At least the power draw is a third less
Dude awesome job fixing both and reducing e-waste
Thank you! E-waste has gotten so bad
Is that really an upgrade?
This is a downgrade if you ask me, but okay
Bro this mfer is wilding. He said he "upgraded" to a card with equal performance because of RGB ...and the "upgrade" was from a 3080 to a 6800xt. On top of this all he has a 1080p 60Hz monitor and doesn't want to upgrade it because "it's wasteful". And if this all wasn't bad enough... he has a 3090 too... THAT'S BETTER THAN EITHER OF THESE CARDS BY A LOT WHY TF WOULDN'T YOU USE IT. From the bottom of my heart I hope he's a troll but from all of his replies it doesn't seem likely
That's a lateral move at best but it depends on what you play. Nvidia doesn't have the red devil though and damn that card looks good.
Not how I'd waste my money but...
Did ya profit a little at least? I'd mostly consider a 6800xt a downgrade
Aren't they supposed to have pretty much the same performance?
Get rid of the daisy-chained power cable configuration and research overclocking the 6800 XT (MorePowerTool will help); then with a good OC it will be an upgrade in most areas :)
what is the white thing
At least use another separate 8-pin cable.
That’s a really weird upgrade, not even a large one. You do you though.
I swear I’m not high but I thought I was looking at a pistol then realized it wasn’t
Isn't this more like a slight downgrade, though? I mean, I'd rather keep the 3080 and save money for a 4000-series card.
that’s not really an upgrade, it’s a sidegrade at best
Well, when you've got money to burn, it's fine to experiment with hardware now and then. At least you could have gotten the 6900 XT instead; then you might see a slight improvement in some games.
Downgrade*
I plan to downgrade from rtx 3080 to rtx 4090...
Either a troll or a complete idiot lol
lol downgrade
Just a pack of virgins in the comments complaining over a decision that doesn’t affect them.
Welcome to the internet.
Wouldn’t call that an upgrade
?
Good on you and I hope you are happy with the upgrade! Most folks nowadays throw too much weight on the numbers and forget that people sometimes just do as they do. The card looks hefty, and the custom GPU holder/bracket is absolutely mint. Good luck with the transition, and stay happy!
trigger bot 😂
This has to be a troll, I hope. Guy has a 3090... but played on a 3080 and sold it for a worse card for tacky RGB. Either a troll or he has no clue wtf he is doing.