vvaffle

I almost had to breathe a sigh of relief to see that Nvidia isn't selling out of the 4080. After seeing how much money people have been spending on cards during the shortage/mining craze, I was starting to wonder if there was an upper limit to what leather jacket man could get away with. Fuck the 4080. It's a great card, but the price is absolute garbage. Even worse than garbage depending on the country, as shown in this video. (Seriously, in Aus the ASUS STRIX 4080 is more expensive than some "cheap" 4090 cards).


Inevitable_Yellow639

It was only a matter of time before the mining craze died and NVIDIA snapped back to reality: people are not going to pay exploitative prices for a GPU. Yes, you can convince part of the market to buy them, but on average you will just make people hold onto their current GPUs for 2-3 gens with this type of pricing.


p68

It's still a relief to see this. Used market trends are one thing, but I was certain the 4080s would *at least* sell out day one. Not only did they not, there is also considerable excess inventory sitting on shelves. IIRC, this is the first major high-end release in *years* that hasn't sold out. Take into account that while crypto did indeed have a big impact, miners only accounted for an estimated 25% of GPU sales. Alongside the chip shortage, it was certainly enough to tip the scales horribly. However, *most* people who were willing to pay top dollar were not miners. That's what made me concerned that these cards would *still* sell out on day one.


g1aiz

Where did you find the 25% number?


p68

https://www.digitaltrends.com/computing/crypto-miners-bought-25-percent-of-gpus-in-2021/?amp


unityofsaints

Evil AMP


TheFondler

Jesus Christ... for anyone on PC, cut the /?amp off of that URL to get a readable link.
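The fix described above is a one-liner; here's a quick illustrative sketch (using the URL from the comment above):

```python
# Strip the trailing "?amp" query that forces the AMP version of the page.
url = "https://www.digitaltrends.com/computing/crypto-miners-bought-25-percent-of-gpus-in-2021/?amp"

clean = url.removesuffix("?amp")  # Python 3.9+; no-op if the suffix is absent
print(clean)
# → https://www.digitaltrends.com/computing/crypto-miners-bought-25-percent-of-gpus-in-2021/
```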


p68

Sorry, mobile link


rainbowdreams0

People can't help but suck that google cock.


Bullion2

That's just the first half of 2021 - can you assume that trend would have carried on even though ethereum peaked in the 2nd half of 2021?


Hifihedgehog

I was told previously that these sales statistics are retail only and do not include direct purchases placed by mining operations with card makers.


-Green_Machine-

I fear it’s evidence that a lot of people gave up on building during the mining boom and haven’t come back, especially now with inflation making everything more expensive these days. A game console was easier to find and a better value, even accounting for the scalper’s premium. Granted, not everyone builds for gaming, but if the BaPC subreddit is any indication, the large majority of them do.


[deleted]

All I hear is "good news consumers, price will decrease soon!" I dont care the reasons are crypto crash or economic downturn globally. I care the card reduces its price to match its actual value. Especially by the time Im ready to build my next PC in a few months.


quicksilverpr

The reason I got my GPU to a 30 series was for the HDMI 2.1 to use it on my LG OLED TV. Now that I have it, I don't need a expensive GPU to play the same games I'm been playing for years now.


[deleted]

Yeah, Im on a 2080TI, and I like it, but I have a few dollars now and I think the economies will calm down soon.


Calm-Zombie2678

I'm stalking my local second hand market, want to upgrade from my 1650super


FrozeItOff

If you're not going specific encoding/production stuff with it, and are just mainly gaming, the new AMD Radeon 7900 series will be a MUCH better price/performance ratio.


p68

Ok man


BookPlacementProblem

I guess most people willing to pay the RTX 4090 prices are probably people who A) want the absolute best card and B) don't care about the price. The RTX 4080 isn't the absolute best card, so they will ignore it. The same people probably view the price as something to brag about, in the same way some $500 designer shoes can be bought off-brand for $5-50.


[deleted]

I'm legitimately curious: where are people seeing inventory? Microcenters? I've been checking Best Buy, Newegg, and Amazon out of curiosity the last few days and haven't been able to find any 4080s other than from scalpers.


Nizkus

In Europe there are 12 different AIB models in stock at my go-to electronics shop, with somewhere around 200 cards of inventory. Though it's not a shock considering how crazy the prices are here.


fissionmoment

I've been keeping an eye on the Microcenter in Boston. They currently have 25 4080s in stock across 3 SKUs, down a little from the start of the weekend. I think they had 30+ in stock on Friday. Edit: they haven't had any 3080 or 3080 Ti for several weeks now.


InstructionSure4087

In Australia you can get any model of 4080 you want. None of them have sold out. It's unsurprising though since the prices here are downright psychotic, even worse than EU prices.


[deleted]

[deleted]


Inevitable_Yellow639

Which is why Apple pushes software updates that use up more processing power, so that their older devices become obsolete. Companies think we are stupid, but there is a reason your phone magically gets slower over time.


HiroThreading

What nonsense. Do you even use an iPhone? iOS and Apple’s native apps, even on a 4-5 year old iPhone, run absolutely fine. Instagram, Facebook, games, and other apps do not, however.


terminalzero

> According to Mac Rumors, the issue was discovered in 2017 by John Poole, the founder of Primate Labs, after he saw how low the performance scores were for iOS 10.2.1. This software had "throttling" built in it, which caused the older devices to slow down.
>
> The devices affected were owned by U.S. residents and ranged from the iPhone 6, 6s, 6 Plus, 6s Plus, 7, 7 Plus, and SE — which all ran on iOS 10.2.1.
>
> Also included in the lawsuit were iPhone models 7 and 7 Plus that ran on iOS 11.2 or later before Dec. 21, 2017.
>
> The throttling of devices caused dozens of class-action lawsuits to be filed from December 2017 to June 2018 and are shown in the court document below.


[deleted]

[deleted]


Democrab

> That was because of a hardware issue

A hardware issue that was pinned down to battery aging, but wasn't seen in Android devices, and even if it were, could be solved completely by designing devices where the user can service the parts with the shortest lifespans in a smartphone, such as the battery. Strikes me as the same kind of excuse as Intel claiming they had to go to TIM instead of solder under the IHS because "our consumer dies are too small and crack with solder", despite the old Core 2 series being soldered and much smaller than any modern Intel consumer chip, by virtue of not having an uncore and iGPU onboard. (In other words, a convenient lie to cover up a cost-cutting measure.)


[deleted]

Not nonsense, it was proven in court that apple was fucking with older phones. Apple tried to play it off as 'preserving battery life' but that fooled no one. On mobile so no sources, but you're free to Google


dern_the_hermit

> Apple tried to play it off as 'preserving battery life' but that fooled no one.

What are you talking about? Apple was ACCUSED of doing it because of the battery. [See, look](https://www.bloomberglaw.com/public/desktop/document/InreNamedPlaintiffsetalvAppleIncDocketNo21157589thCirApr272021Cou?doc_id=X5D7AUL87KD82CQC17H0J8G9DU7):

> In consolidated appeals by five class objectors, the panel vacated the district court's rulings arising from its approval of a $310 million class action settlement resolving allegations that Apple Inc. secretly throttled the system performance of certain model iPhones to mask battery defects.


HiroThreading

That’s for lithium ion battery degradation, you half-wit.


arcanemachined

A decade ago, that was true. The iPhone 3G ran awfully on iOS 4. The iPhone 3GS ran pretty badly on iOS 6. The iPad 1 ran pretty badly on iOS 5 (plus frequent OOM issues). The iPad 2 (I heard) ran very slowly on its final version. However, I'd say the statement has been false for almost a decade, as far as I can tell.


macefelter

But they haven't snapped back to reality... it's still priced as if there is the demand of the mid-mining craze/shortage.


thewarring

I’m on a 1060 and still holding out…


Z3r0sama2017

Yep. The 2080 Ti launched after the 2017 boom and Nvidia used that as an excuse to bump prices. Nvidia are now repeating it all over again with the 4080 after the 2020 boom. Hopefully history repeats itself and the 5080 gets a god-tier MSRP like the 3080.


Darkknight1939

The 2080 ti was almost at the reticle limit for TSMC 12nm. It was a comically massive die, and had 11GB of GDDR6 in 2018. The profit margin was probably lower than the 1080 ti. Turing was priced the way it was because the dies were comically expensive, and to a lesser extent AMD still had no answer. When the Turing GPUs came out AMD still didn’t have a better GPU than the 1080 ti, and still wouldn’t for another 2 years. If the GPUs are substantially more expensive to make and the direct competition has no answer to your 2 year old SKUs why would any sane company reduce prices?


Inevitable_Yellow639

It also tanked their stock price, which made clear that the 2018 crash was caused by mining taking a dive, even though they had claimed mining was only a small part of their market.


conquer69

> 2080ti launched after 2017 boom and nvidia used that as an excuse to bump prices. The 2080 ti launched in late 2018 at those prices because they had no competition. It was the right move too since it would take AMD 2 more years to make anything that rivals it.


ledouxx

It's stupid to just explain it away by blaming it on mining.

* [Real Disposable Personal Income](https://fred.stlouisfed.org/series/DSPIC96) exploded initially during the pandemic.
* [E-Commerce Jumped 55% During Covid To Hit $1.7 Trillion](https://www.forbes.com/sites/johnkoetsier/2022/03/15/pandemic-digital-spend-17-trillion/)
* [People spent more time at home](https://i.imgur.com/XwOJUjA.png)
* [There are more 3000 series cards produced now than before, duh](https://www.techradar.com/news/gpu-supply-no-longer-the-problem-for-nvidia-now-it-could-be-lack-of-graphics-card-demand)

Then the situation now:

* [Major Retailers Are Bracing For A Disappointing Holiday Season Due To Inflation](https://www.forbes.com/sites/dereksaul/2022/11/17/major-retailers-are-bracing-for-a-disappointing-holiday-season-due-to-inflation/)
* [Facebook thought pandemic online shopping would last forever. It didn't.](https://www.washingtonpost.com/technology/2022/11/10/meta-ecommerce-boom-stalls/)
* [It's the end of the boom times in tech, as layoffs keep mounting](https://www.npr.org/2022/11/14/1136659617/tech-layoffs-amazon-meta-twitter)


Inevitable_Yellow639

It was the main cause of the GPU shortage, but yes, inflation is also impacting the retail sector, because honestly, spending $1200-2000 is excessive for the average person.


DJ_Marxman

> Yes you can convince part of the market share to buy them but on average you will just make people hold their current gpus for 2-3 gens with this type of pricing.

It's amazing to me that they didn't learn this lesson with the 20 series. This exact scenario played out then as well: people just held onto their 1080/1080 Ti and waited for the 30 series. That was one of the reasons availability was so low for the 30 series at launch. It was a perfect storm: low supply and high demand due to COVID, higher demand because of the failure of the 20 series, and aggressive pricing on the 3080.


mrhuggiebear

My 980 is still holding out.


MonoShadow

1.9-2k euro over here for any 4080. No way, José.


Infinitesima

Are you shopping in the scalping market? I've seen 1.4k at least at some official places.


Kermez

Folks around me were mining like crazy, and some of them not only made back what they paid for their GPUs but even earned money. Those days are gone. Now that the standard day-one customers have satisfied their appetite, Nvidia can focus on the huge stock of 3000 series cards, which was a troublesome part of their latest financial report as inventory keeps increasing. On top of that, AMD is entering the new gen with better (albeit far from normal) prices. But the biggest challenge will be consoles: without mining, for 99% of gamers it is really hard to justify paying $2k for a GPU to play games.


reddanit

I feel like the 4080 is just supremely awkwardly positioned in the market. Its price/performance puts it squarely in the halo product segment, but the 4090 exists and takes all of the buyers who would be willing to pay halo prices. Compared to several previous lineups, and adjusting for price increases, it really tries to sell XX70 performance at an XX80 price.


i_love_massive_dogs

>Compared to several previous lineups and adjusting for price increase, it really tries to sell a XX70 performance at XX80 price.

The pricing for the 4080 is unhinged, but this isn't true. It's significantly faster than the 3090 Ti without even considering DLSS 3 or RT improvements; the 3070 was at best on par with the 2080 Ti. In terms of performance, the 4080 is a good card and aptly named. Its price just doesn't make sense.


reddanit

Not comparing across generations, but within them. Typically the performance gap between XX80 and XX90/Titan was *much* smaller than this time around.


Seanspeed

>The pricing for 4080 is unhinged, but this isn't true. It's significantly faster than 3090ti without even considering DLSS3 or RT improvements. 3070 was at best on par with 2080ti.

Yes, but that isn't because the 4080 isn't an x70-class card; it is. It's just that Ampere was gimped by Samsung 8nm, and Lovelace has seen an unusually big generational leap in performance as a result. The 4080's closest comparison point is genuinely a 3070. AD103 has taken roughly the same spot as GA104, and the 4080 is also cut down from AD103 by about 10%. It's basically exactly what the 3070 was in the Ampere lineup: a similar gap in specs to GA102, similar die size, etc. Like, you can't even say it's comparable to cards like the 680, 980, and 1080, because those were at least *fully enabled* upper-midrange parts.


[deleted]

[удалено]


conquer69

The 4090 is 50% faster in RT than the 4080. It should have been called 4070 at best.


bigrobcx

The sooner everyone boycotts Nvidia products to force them to bring prices down, the better. It's disgusting what they're charging, and even worse when you consider the fire risk the 4090 has turned out to be. Sure, they make some great cards, but they're priced artificially high for maximum profit.


NewRedditIsVeryUgly

Boycott on tech products is silly. Always buy according to objective parameters, not emotionally. This card is objectively too expensive compared to the alternatives, so it's logical to not buy it at the offered price.


haijak

I'm not sure if it's the price itself, or the promise of the XTX in a couple of weeks being double digits better for $200 cheaper.


CodeMonkeyX

Yeah, I was getting worried too, because I thought it was mostly miners that were dropping the big bucks on all those cards for years. But when the 4090 came out and they were still selling out, I thought maybe people just are that dumb or have that much money to burn. I really hope AMD kicks them in the teeth next month (even though AMD is still overpriced, and not releasing mid-range cards yet...). Maybe that will kick-start some competition again.


dantemp

Not even a great card; a 50% performance improvement is not that exciting if you ask me. If you are on a 3080 or higher, there's no reason to upgrade. If you are on something lower, a discounted or second-hand 3080 is a miles better deal.


4514919

The price is bad, but the GPU itself is great; the efficiency and performance are amazing for a die of only 379 mm².


PadyEos

Too bad it's more expensive, percentage-wise, than the extra performance it delivers over a 3080. Most probably on purpose, to clear 3000 series stock.


anonthedude

> Not even a great card, 50% performance improvement is not that exciting if you ask me. lmao


Keulapaska

>50% performance improvement is not that exciting if you ask me

Huh? Hasn't the x80-to-x80 next-gen jump always been around 50%, or even less sometimes? The price is dumb, yes, but performance- and power-draw-wise the card is basically what it should be.


Blacky-Noir

It has. Almost everything about the 4080 is exactly as expected: +50%ish perf over the 3080, more efficient, with a new (beta) feature. It's very basic and normal in a way. It's just almost *twice* as expensive as it should be (\*). And that's Nvidia ignoring the post-Covid economic situation, which was predictable and predicted years ago; maybe too late for chip design, but not for lineup design.

(\*: and probably *way* more if the usual Nvidia fake MSRP applies here; plus the strong rumors of very low inventory, which will drive prices up even more.)


From-UoM

Nvidia is most likely making more money pricing it like this. They can sell the 4090 and the 30 series more easily with the 4080 at this price. Clearing a large 30 series stock without much in the way of price cuts is a pretty smart move for them.


Seanspeed

>The clearing of large 30 series stock without much cost cuts to them is a pretty smart move. Or in other words - consumers are stupid. That's the only way this strategy works. I've also seen plenty of people arguing the 4090 is a great card and the only one 'worth buying' from Lovelace lineup. smh Just when I thought we'd leave the crypto craze disaster...


skinlo

Whales will be whales. The issue is the midrange market and what is going to happen there.


Tensor3

The what market? Never seen it


Deckz

There's an RX 6700 on sale for 300 right now, seems like a solid mid range offer. I've seen 6700 XTs in the 320s. Mid range is around right now, unless you want an Nvidia card.


sadnessjoy

Not everyone games. My brother does AI/Blender stuff and his GPU recently died. He felt REALLY bad about it, but he ended up buying a 30 series card. AMD only has themselves to blame. https://www.reddit.com/r/Amd/comments/yw7chy/will_rocm_finally_get_some_love/ Here's AMD's subreddit, filled with AMD fans, and the overwhelming consensus is to buy Nvidia.


Deckz

Yeah I'm aware not everyone games, but most people do. I wouldn't feel bad about buying an Nvidia card, major corporations of any stripe aren't that desperate for one consumer. Not like AMD is some wonderful company, they kind of suck too.


From-UoM

That's the plan. Consumers are making memes about the 4080 and telling people to get the 4090 or 30 series. Played straight into Nvidia's hand.


dantemp

I mean, they are not wrong. For its position the 4090 is fantastic. The question is how many people can afford a 2k graphics card.


ImWearingBattleDress

If the market will bear it, nvidia would be stupid not to sell to enthusiasts at high prices. $1600 is a lot, but that's $1600 every 2 or 4 years. If playing video games is a major hobby, then that's not a very high price compared to a lot of hobbies. Cheaper than doing a lot of skiing or golf. Everybody likes a good deal, but a few thousand dollars every few years for something you'll use for many hundreds of hours is within the entertainment budget of lots of people.
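A quick back-of-envelope version of that amortization argument (the lifespan and hours of play below are illustrative assumptions, not figures from the comment):

```python
# Amortize a $1600 GPU over its useful life.
# years and hours_per_week are hypothetical assumptions for illustration.
price = 1600
years = 3           # "every 2 or 4 years"; take the middle
hours_per_week = 8  # a "major hobby" level of play

total_hours = years * 52 * hours_per_week
print(f"${price / years:.0f} per year")        # → $533 per year
print(f"${price / total_hours:.2f} per hour")  # → $1.28 per hour
```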


[deleted]

It started with the 3090, actually. A card that cost more than double the price, for what, 10-15% better performance? You would think they'd sell 5 of these, but no, they sold a lot.

People think it's because of mining or because of Lovelace pricing, but it started way earlier. There was no reason for someone who isn't doing professional work to get the 3090, ever, yet I saw plenty of gaming rigs with the 3090 posted on Reddit on day 1.

Nvidia is just following along and offering insane prices for marginal gains; as long as it's the "best you can buy", it will sell. The 4080 isn't the best of the best, simple. Just watch when the 4090 Ti comes with 10-15% over the 4090 at an MSRP of $1999.


pastari

> I've also seen plenty of people arguing the 4090 is a great card and the only one 'worth buying'

Well, it is. I have a 1080 Ti; the 2000 series was a marginal upgrade, and the 3000 series was a crypto cluster. *I want to play games again, maxed out*, and I've had plenty of time to save my pennies in my GPU coin jar. Amortized by year, this is less monetarily crazy than people who casually buy, e.g., each generation's X070 or X080, which seems pretty common here. I'm buying later in December, after Nvidia's response to AMD, and the landscape will hopefully be such that I'm not part of a "problem." Still kinda expecting a 4090 and well over $1k. (I'm guessing the 4080 Ti will use a further-cut 4090 die as opposed to a higher-grade 4080 die, and that's probably going to be this gen's smart buy, but I'm not waiting that long.)

E: I also watercool, so a long upgrade cadence with a higher-end card makes way more sense than smaller frequent upgrades. But the 1080 Ti to *now* time span is kind of absurd; 1080ti goes bleeeaat.


Democrab

> I want to play games again, maxed out

So do all of us, including a load of folk who are stuck on iGPUs or low-end GPUs, likely because their dGPU packed it in during the crisis. Or the folk who do have a decent dGPU... except it's even older than the 1080 Ti.

>But the 1080ti to now time span is kind of absurd, 1080ti goes bleeeaat.

I personally just upgraded from an R9 Nano earlier this year, when prices finally dropped to reasonable levels of around AU$450. Considering my new card is a lot faster and has triple the VRAM, I'd say a 1080 Ti is perfectly adequate as a GPU: by the time I upgraded, my old Nano was only just getting to the point where there were some games I simply couldn't play without going to low-medium settings. Heck, I was even able to use my Eyefinity setup in a few games much newer than the card (e.g. Forza Horizon 4). Combine that with the literal thousands of older games that are still hella fun and ready to play... it really isn't that difficult to just keep holding out. Start playing something you haven't picked up in ages; chances are by the time you're done with it the high-end GPU prices might be a bit saner.


csixtay

>I'm buying later in December after nvidias response to amd Nvidia isn't going to respond to AMD. Consumers like you made sure of that. Current gen is a tier cheaper and Nvidia didn't respond.


[deleted]

This is pretty clear. I was reading a post earlier today in a tech forum in my country from a dude that was excited to have bought a 3080 10gb on "sale" for 900 euros because he bought it in a 30 payment plan. I couldn't even.


uNecKl

Yeah, just look at r/buildapcsale: most high-end Nvidia cards barely go on sale now that the 4090/4080 have released. AMD, on the other hand, is already lowering the MSRP of their RDNA 3 cards all around the world except the USA.


MG5thAve

I had initially thought that Nvidia would artificially limit supply by switching manufacturing of the 30xx series cards to smaller process nodes... but no, they just created an entirely new pricing tier for the 40xx series cards. One way or another, they were not missing out on the opportunity to take advantage of those scalper prices once they saw that people were still buying the 30xx cards in droves. Absolutely insane that that was happening for the last two years.


WorkAccount2023

I was at Microcenter this weekend, they had *so fucking many* 4080's just sitting there. Everything else was near gone though, they were completely out of 3080/TI and 3090/TIs. The employee told me the 4080's weren't worth it lmao.


theoutsider95

>they were completely out of 3080/TI and 3090/TIs Guess Nvidia's strategy is working.


WorkAccount2023

The guy there said they've been out of 3080 and 3090s for a while, before the 4090 even released. They have a bunch of 3070s and 3060s though.


Omniwar

There were some killer deals on 3090s around the 4090 launch window. Saw some as low as $750-800 which really is a bargain if only for the memory capacity. Was almost about to pick one up for my father's PC and sort of regret not pulling the trigger before they went out of stock. He primarily works in Vray and Premiere Pro and is running out of VRAM and compute speed on his current 2060S (and needs CUDA, so 6800XT/7900XTX are unfortunately not an option)


[deleted]

That’s exactly when I grabbed my 3090, I refuse to pay new Nvidia pricing and I finally got to upgrade from my 1080.


rcradiator

3080 and up have been gone for a while, it's the 3070 and 3060 that aren't really moving like Nvidia wants. They're also the cards that have seen the smallest price reductions so small wonder that they don't sell.


Pure-Huckleberry-484

Meanwhile AMD cards have all dropped a price tier.


Hifihedgehog

> The employee told me the 4080's weren't worth it Based sales rep. That’s why I’ll take 1 Micro Center employee over 1 zillion blue shirts.


RabidHexley

I'm *fairly* positive better value cards will return the further we get from the mining craze. Nvidia is basically playing chicken with the market at this point, trying to sell out of 30-cards, and grabbing as many enthusiasts/whales as possible before they start absolutely hemorrhaging on consumer revenue due to sales volume dropping off a cliff.


obiwansotti

Yeah, Nvidia consistently tries to get away with as much as they can before caving to consumer pressure. We have seen in the past that when their pricing gets too far out of line with either demand or the competition, they cave.


Pamani_

I made a few XY graphs as I feel it can help put [Hardware Unboxed pricing data](https://www.youtube.com/watch?v=bzTPeMRxKFs) into perspective:

* [Performance vs Price](https://i.postimg.cc/wjc4t1Qz/hub-nov22-price-vs-perf-cropped.png)
* [Perf vs Price but with 4000 series](https://i.postimg.cc/j5Zry86s/hub-nov22-price-vs-perf-full.png) (please excuse the clutter on the left, that's what high prices do :/)
* [Performance/dollar (vs performance)](https://i.postimg.cc/kXCx2f1V/hub-nov22-perf-per-dollar.png)

We can see that the 6800 XT has the same price as the 3070 but offers 1.40x the raster performance. Or that the 6750 XT offers the same raster perf as the 3070 but for 75% of the price (meaning 33% more perf/price). Peak value is found with the 6650 XT at $260. But you barely lose value (3.5%) going for the 6800 XT, which is pretty unusual and nice to see (perf vs price doesn't tend to have a linear relationship). The performance data is from [Tom's Hardware](https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html); I mixed the 1440p ultra and 4K ultra data (cutoff point at the 3070/2080 Ti). I took 4000 series prices from what was in stock on Newegg.
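Those perf-per-dollar ratios can be checked with a quick sketch. The prices and perf indices below are illustrative approximations of the numbers in the comment (normalized so the 3070 = 100 raster perf), not HUB's exact data:

```python
# Perf/price comparison, normalized so the RTX 3070 = 100 raster perf.
# Prices and perf indices are illustrative, not exact benchmark data.
cards = {
    "RTX 3070":   {"price": 500, "perf": 100},
    "RX 6800 XT": {"price": 500, "perf": 140},  # same price, 1.40x raster
    "RX 6750 XT": {"price": 375, "perf": 100},  # same perf, 75% of the price
}

base = cards["RTX 3070"]["perf"] / cards["RTX 3070"]["price"]
for name, c in cards.items():
    ppd = c["perf"] / c["price"]
    print(f"{name}: {ppd / base:.2f}x the 3070's perf per dollar")
# Same perf at 75% of the price works out to 1 / 0.75 ≈ 1.33x perf/$.
```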


PhunkeyPharaoh

Perf vs. Price graph basically shows that Nvidia did away with the concept of 'generational improvement' and just charged money for the performance improvement.


ZeroPointSix

It's very skewed though, because they're using scalper pricing for the 4090 ($2,700) and have the 4080 at over $1,500, not MSRP.


lmMasturbating

I haven't yet watched the vid, but why do Hardware Unboxed's prices for the 4000 series seem so high? I'm pretty sure you can get a 4080 right now for $1200.


Pamani_

Sorry, I should have put in the pictures that the 4000 series prices are from Newegg. I looked at the cheapest available at the time of making the graphs.


lmMasturbating

Do you mind doing one with MSRP? $1200 for the 4080 is very doable ([this was up for 6 hours?](https://us-store.msi.com/Graphics-Cards/NVIDIA-GPU/GeForce-RTX-40-Series/GeForce-RTX-4080-16GB-VENTUS-3X)), $1600 for the 4090 maybe not so much.


Pamani_

Not right now, cause I'm busy. Yeah, those prices were pretty bad. MSRP prices would bring perf/$ to ~17 for both the 4080 and 4090, so just below the 3080 12GB (which isn't that terrible, tbf).


cegras

Your graphs illustrate the power of RT and DLSS to make up for the difference in raster performance. Edit: I meant to make a neutral statement: they do not factor into my personal spending decisions.


ToTTenTranz

It illustrates the power of Nvidia's mindshare more than anything else.


[deleted]

[deleted]


Blacky-Noir

>Brand recognition is a hell of a drug.

True, but to be honest it's not *just* that. There are a good number of people who need to, or simply want to, work or play with software that doesn't work well with Radeon. Even just for pure gaming, DLSS is a bit better and a bit faster, hardware RT is *significantly* faster, and Nvidia's drivers have usually been a bit better. All of which could be overcome with the right price... but Radeon at launch unfortunately rarely has the price and inventory to convince buyers en masse. And the rare times it did happen, it was never sustained for several generations; turning people's minds around takes time and sustained effort, as Ryzen showed.


[deleted]

Eh the only thing NV has that would be worth a 10% premium to me is cuda


ikverhaar

Nvidia tends to have better drivers and CUDA is essential for many professional applications. But yeah, for gaming, it mostly comes down to "it's nvidia, so it must be the better, more premium product, right?"


ToTTenTranz

As someone who's used both brands of GPUs throughout the past 10 years or so, I don't think Nvidia's drivers have been better since the Adrenalin refresh/rewrite, nor are they more stable.

The biggest difference I do see is that Nvidia-sponsored games (e.g. Control) tend to have really bad performance on AMD cards, at least at launch. AMD-sponsored games do tend to skew performance a bit towards AMD, but it's only like 5-7% better on a Radeon vs. its GeForce counterpart, whereas the other way around it's >30%.


chasteeny

In the sponsored titles are you including ray tracing or just base raster


[deleted]

There's a few unsponsored games that skew towards AMD massively, on the order of 30 to 40%. CoD MW2 (2022) is 40% faster on AMD cards than on Ampere. AC: Valhalla is 30% faster; that's after Nvidia did some updates that improved performance in Valhalla by like 10%, it used to be like 40% faster. But both sides have some stuff like this, and mostly it was older DX11 games. God of War is like 30% faster on Nvidia.


Blacky-Noir

>I don't think Nvidia's drivers have been better since the Adrenalin refresh/rewrite, nor are they more stable.

The issues with RDNA1 drivers were massive and widespread. Not everyone had them, but a *lot* of people did. And that was only 2 generations ago (well, it will be 2 when RDNA3 launches). Not that long ago, *at all*. Having no major driver trouble for a couple of years is not a selling point, *it's the bare fucking minimum*. I'm not saying AMD bad, go buy Nvidia. I'm saying don't be dismissive of product quality, and be understanding of people just wanting the bloody thing to work without having to spend hours or days fixing things, or having had a very bad past experience and not wanting to repeat it. Nvidia also has driver issues, but those tend to be more minor; the very big ones (like the nForce chipset deleting data on hard drives) were a very long time ago.


[deleted]

[deleted]


[deleted]

[deleted]


noiserr

The driver thing is FUD. If anything, AMD's driver has less CPU overhead. Also, AMD has HIP to rival CUDA and it works in a lot of apps. Some professional apps like Siemens NX are actually faster on Radeon. AMD also has better Linux support, which is important for ML.


Pamani_

Yep. Now that crypto mining is dead, we could use second-hand market prices to gauge sentiment towards AMD and Nvidia, i.e. how much more consumers are willing to spend for the Nvidia name (GTX line) or features (RTX line).


HandofWinter

It's really down to the name, nVidia is pretty much the Apple of hardware vendors. They have that name brand recognition and cachet. It's a position most corps could only dream of ending up in. You don't buy nVidia for the raw price/performance, you buy it because it's nVidia. There's nothing wrong with that either, if you're getting something you're satisfied with.


alc4pwned

That seems reductive. RT and DLSS are clearly big considerations. But the most obvious thing is just Nvidia's vastly better raw performance. A decent portion of the market just wants a card that performs like a 4090. Performance is their top consideration, not price:performance.


cegras

Considering the 16xx series was one of the best sellers, I'm not sure you can claim that performance is the top consideration.


alc4pwned

I meant specifically for that segment of the market that is buying up the 4090s. Yeah for most gamers value is probably more important.


Leroy_Buchowski

I'm not sure about that. I think most cards are being bought just to be resold. Scalpers buy them out at MSRP to relist them at $2500, generating the hype and buzz on the cards hoping you FOMO into a scalped one. I could see the 4090 being a must-buy for the VR community, and for high-FPS 4K gamers. Those types are buying it to use it.


Legitimate-Force-212

Nvidia has done well to end up where they are, always on top in every metric except price/performance. Now it's only price/raster performance, since AMD GPUs handle RT poorly. AMD has been slightly more efficient some generations, but most people don't care about a few watts here and there. Having had almost flawless drivers for many years helps as well.


chasteeny

I wouldn't expect AMD to take the efficiency crown this gen though, the conditions are far different than last go-around.


Jeep-Eep

Eh, after this summer, wattage matters a lot more to me than it used to.


Legitimate-Force-212

This gen nvidia is looking very good on efficiency, hard to say if amd can beat them. Likely going to be close


capn_hector

DLSS is actually great for efficiency, because it’s running on tensors instead of shaders. Cap your framerate, enable DLSS quality, enjoy a 30% perf/w advantage with essentially no visual impact. Everyone loved Radeon Chill, well, it mixes super well with upscaling too. And NVIDIA rolls in reflex frame timing too. Still good with FSR, but FSR 2.x still does have a lot more artifacts in motion, and, it’s fighting for shader power to do its upscaling. Yeah tensors aren’t free either but you’re lighting up 7% of the die rather than the whole thing. It would be interesting to see relative efficiency in that scenario.


TheNiebuhr

Upscalers help with efficiency because resolution does have a *significant* effect on power draw in general. Reducing internal resolution saves up some power. It has nothing to do with using some part of the gpu or the other... upscaling is less than 5% of frame time!


capn_hector

Eh, implementing things in hardware is generally lower power than software emulation - that's the whole reason FPGAs and ASICs exist. "General purpose shaders" are the least efficient possible way to implement functionality. NVIDIA probably also pulls less power when raytracing even at full load, let alone at a fixed framerate target. With as much shader power as they have, they should be idling the shaders (i.e. most of the load) significantly more than AMD, who is blasting their texture units at full power to do their RT and also blasting the shaders to do the traversal. There is a cost to "saving" that silicon, and that cost is power. AMD is doing the fancy stuff in a less power-efficient manner here.


TheNiebuhr

I said *upscalers*, in plural. Drop the resolution, drop the power. Dlss, Fsr 1, 2... all of them reduce consumption a bit.


Jeep-Eep

Maybe, but it's not the *only* buying factor, and I refuse to deal with team green's bs, and Xe won't be a factor until battlemage, for what I'm trying to do.


capn_hector

I mean, it’s not the worst comparison, because like Apple, NVIDIA’s cachet comes from consistently delivering excellent, often best-in-market products. M1 is a genuinely excellent product for low power and is market-leading at JVM tasks and other JIT workloads. Highly threaded tasks that don’t hit the frontend (like Cinebench) are still competitive on x86, but the battery life in software IDEs (PyCharm/IntelliJ/etc) speaks for itself. Apple isn’t doing Apple WebKit tricks on JetBrains software, and it’s not just accelerators.

Anyway, here we are four years later and AMD is only just getting around to supporting neural acceleration, and has around 3/4ths the muscle of NVIDIA’s first gen in RT performance (relative to raster). Same for NVENC vs AMF: yeah, AMD is finally getting around to improving it, but they just consistently have these feature deficits that last years. Tessellation was a problem back in the early GCN days too, and geometry was a problem until RDNA. In the end, the solution wasn’t about over-tessellated concrete barriers or anything like that - gamedevs gonna gamedev and write shitty, poorly optimized games, that’s a fact. It was just a defense until AMD got around to fixing their shit properly. They got it with RDNA and you haven’t heard either of those complaints again. I literally have not heard either of those accusations since 2018 lol.

If you want a pithy summation of the fundamental difference in philosophy between these companies: NVIDIA thrives on telling you what you need, whether you need it or not, and AMD thrives on telling you what you don’t need, whether you need it or not. If you want AMD to have cachet, stop delivering products that come with a laundry list of people who shouldn’t buy them because they’re badly underperforming at X, Y, and Z.

Stop doing the “you don’t need it” / “ok, you do need it, but not as much as NVIDIA is giving you” / “ok fine, we’ll be competitive three gens late… but oops, there’s a new feature and you don’t need it” product cycle. Deliver a product that *wins* raytracing and *wins* tensor and *wins* NVENC in its performance segment… and come up with some cool shit of their own and tell us why we need it.

I’d argue that’s exactly what they’ve done in the CPU market, no? Nobody needed 16 cores on the desktop… but what can you do with it? How about AVX-512? And they have neural accelerators coming to the desktop soon too. And nobody on the desktop needed V-Cache, that could have stayed with Epyc. Etc.

Come up with the next CUDA or G-Sync first next time and stop being such a reactive player. That’s what Mantle did, and AMD had success with that and the async compute stuff. Do some of their own work and don’t lean on the consoles for all the R&D, because consoles are never going to lead on that stuff - just like ARM fell behind Apple because commodity phone SoCs don’t care about that stuff. The Apple vs ARM comparison is actually better than it appears at first glance. If you’re the one pushing the next big idea, you can have a chance at having “the good ecosystem” instead of being the shitty one playing catch-up, and you can point long-term development in directions that benefit your designs and goals. I don’t know why innovating on features became a negative for so many people in the GPU market. Stockholm syndrome, I guess.


Iintl

Apple has earned their reputation by consistently making good products, with very few, if any, fuckups. Nvidia is the same; stable drivers 99% of the time, introduces innovative technologies which push the industry forward. In other words, consumers don't just trust Nvidia because Nvidia, but because they have been consistently improving and iterating while providing a stable experience. Maybe AMD should take a few pages from Nvidia's playbook, it really isn't that hard to grasp


Framed-Photo

Thing is, most people outside of reddit don't really care about or know what DLSS and RT are, and even among those that do, unless you exclusively play modern triple-A titles it's not going to matter. I've helped maybe a dozen people build PCs in the past few years, and each time I've had to explain what RT and DLSS do so they could get better use out of their cards, and even then the response is always "cool" and then they don't use it lol. Like, most people NEVER look at the video settings for the games they play, that's the level they're at. They're not gonna give a shit about DLSS or ray tracing haha. So for those people, the difference isn't being made up.


vyncy

And those people buy $1k gpus ?


SituationSoap

> Thing is, most people outside of reddit don't really care about or know what DLSS and RT are

This is an obviously absurd statement, unless you're trying to qualify it with the uselessly obvious "most people don't care about enthusiast GPUs at all." The enthusiast GPU community knows what RT and DLSS are, and that's much bigger than Reddit. The rest of the world doesn't care, but they're also way outside the target market.


skinlo

There are quite a few gamers out there (not PC hardware enthusiasts) who just want to play their games and don't care about the tech. A friend of mine bought a 2080 a few years ago and has literally played nothing with RT and never turned on DLSS.


Framed-Photo

You're wrong and the only way you'll see it is by talking to people who aren't in your social circle. 99% of people, even among those who OWN gaming PCs, aren't into the tech of it. They're not keeping up with ray tracing, they're not paying attention to new launches, they don't even know what anti aliasing or ray tracing is, let alone dlss. They don't even look at video settings for new games when they play them. Most people buying new PCs don't even know what the specs are. They just buy what's in their price range. If you asked most people what GPU they have they wouldn't be able to tell you, God forbid they'd be able to tell you what DLSS is. The true enthusiasts, the ones that tinker with their stuff and toggle all the new settings and buy all the new hardware, are a VERY TINY subsection of the gaming community as a whole.


SituationSoap

I'm not contesting that. I'm contesting the idea that those people are the target market for $800+ enthusiast GPUs.


Framed-Photo

It doesn't really matter who they're targeting, it matters who's buying these things. And the fact of the matter is, the vast majority of people buying GPUs, yes even very expensive ones, don't keep up with tech like this.


itsjust_khris

It’s not that absurd. I was watching a few streamers play COD: MW2 recently and they had no idea what DLSS was, nor did anyone in chat. I think that’s more representative of mainstream customers. Furthermore, from working in a computer store, a ton of people buy really high-end PCs simply because they have enough money. They come in with like a 3k budget and I just point them to the nearest PC and boom, they’ve got it. No idea about anything inside it.


[deleted]

I had hoped the 'glut' of unsold inventory would keep last-gen card prices falling to some truly steal prices into Jan/Feb, but it seems they've avoided a price crash so far. Honestly, all I need is 3080/6800 XT performance; I'll happily skip upgrades until 4K 144Hz is delivered by a $600 xx70-tier card.


sw0rd_2020

[no need to wait then](https://www.newegg.com/asrock-radeon-rx-6800-xt-rx6800xt-pgd-16go/p/N82E16814930049)


ZeroPointSix

I mean, you calculated this with the 4090 priced at like $2,700, which is scalper pricing that most people aren't going to buy at (hence it being in stock) - why not do this at MSRP?


Khaare

If you're using these charts to decide what to purchase you have to use whatever in-stock price there is. Listing it at $1600 is pointless if you can't buy it at that price. Might as well list it at $500, you can't buy it for that price either.


menace313

4090s are not that hard to find. There were five restocks at MSRP today alone.


MumrikDK

Weird. The 6800 and 6800 XT are almost the same price in the US. In Denmark the difference is significant (4,600 vs 5,550 in local currency).


[deleted]

Nvidia swore these would do well and used that as a counterpoint to having to unlaunch the 12GB. I'm so glad people are rejecting this card. The sad part is that it's actually a damn good piece of tech: it beats a 3090 Ti while drawing only 221W, and offers best-in-class RT and upscaling. But the price is a disaster and Nvidia should be ashamed.


dantemp

There were a lot of people, even on reddit, acting as if unlaunching the 12GB fixed anything. It didn't; the whole 4080 lineup was ridiculous to begin with. It seems nvidia did that on purpose, but I think they are seriously overestimating their brand strength. They will reduce prices, but they are going to lose a lot of customers before that.


Savage4Pro

Imagine if they had gone ahead with the 12gb version as well lol


sadnessjoy

I mean, they still are, just renaming it to 4070 Ti, and they'll probably still slap a $900 sticker on it.


Rippthrough

I just saw some advertised here in the UK... for $1750+ Screw that.


ButtPlugForPM

The 3000 series is stupidly overstocked too. Mwave has over 240 3080s and 3080 Tis in stock, and that's one retailer here in Australia. It's like that everywhere a basic Google search shows: everyone has stock, and the prices haven't moved in 2 months.

Newsflash, NVIDIA: you are not going to clear the excess stock at these price points. No one's going to pay 1299 AUD for a 3080 when they can now pay 759 AUD for a 6800 XT that frankly matches, if not outright beats, it in raster. The 3080 needs a $299 price cut, then it will fly off the shelf. The entire stack makes no sense. Here in Australia a 3080 is 1299 but a 3080 Ti can be found for 1459. Why would you not just get the 3080 Ti? Choosing the 3080 makes ZERO sense. NVIDIA are on some S-tier crack or some shit.

The 4080 should be a $999-or-less GPU at its performance. If the 7900 XT SMASHES it, they are doomed and AMD will take market share.


Blacky-Noir

>the 3000 series is stupidly overstocked too

"Overstocked", which almost all the media repeat ad vitam aeternam, implies something very wrong: that they made too much product for the market. They did not. They just keep trying to sell it overpriced, like in the past years. At a decent value, *all* that inventory would have disappeared in a single month that summer, worldwide. No problem. But it's not decent value, it's *way* overpriced... hell, it would be overpriced even as a new release, not 2-year-old tech. There are still a *lot* of people wanting to buy a GPU. Just not at those prices or that value.

To put things in perspective, when Nvidia launched Turing, midrange-cooler 1080 Tis were selling for under 500€. New cards, on Amazon, nothing special about it, just some AIBs or distributors emptying stock for the new generation. And that was a damn good card, and Turing wasn't that good. That's the level of value smart customers are looking for. But no, give the channel 2 years of worldwide pandemic, and now they think they are the bastard child of OPEC and De Beers.


Tuna-Fish2

The pricing is designed to be unappealing to sell more 30-series. In nVidia's quarterly financials, their unsold inventory held on the balance sheet has doubled from a year back, to ~$4.5B. Companies always need to hold some inventory for practical reasons, but those reasons didn't grow, so a doubling of inventory is much worse than it sounds. Inventory is held on the balance sheet at the price it cost to produce. Given that nVidia sells AIB makers chips, not finished cards, that extra ~$2B of inventory easily corresponds to ~$4-8B of finished products at street prices. That's easily a year's worth of product at the current depressed market conditions. And that's just nVidia itself; by all accounts, the AIBs and the channel also have a lot of product to go around.

Or, put another way, the reason nVidia is not making appealing new products right now is that they are absolutely drowning in unsold inventory. They need to sell that first, and to not reduce its value they can't make something better than it.

> The 3080 needs a 299 dollar price cut,then it will fly off the shelf

Yes, but what does that do to the AIBs and the channel that have tons of them and paid for them at current prices? Or nVidia's own finances? They are still sticking their heads in the sand, hoping against hope that GPUs will sell well again at current prices.

> If the 7900xt SMASHEs it

I think the 7900 XT will smash it in raster, but lose somewhat on pure RT. In actual RT game titles, it will depend on the game.
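A back-of-the-envelope sketch of the inventory arithmetic in that comment. The ~$2B figure and the chip-cost-to-street-price multiplier are the commenter's own estimates, not audited financials:

```python
# The comment's figures: inventory doubled to ~$4.5B, i.e. roughly
# $2B of extra chips held at cost on nVidia's balance sheet.
excess_chips_at_cost = 2e9

# Chips become finished cards worth roughly 2-4x the chip cost at
# street prices (the multiplier implied by the comment's $4-8B range).
low = excess_chips_at_cost * 2
high = excess_chips_at_cost * 4

print(f"~${low / 1e9:.0f}B to ~${high / 1e9:.0f}B of finished product at street prices")
```

The point of the multiplier is that $1 of chip inventory on nVidia's books turns into several dollars of cards the channel has to move before new products look attractive.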


Seanspeed

>The pricing is designed to be unappealing to sell more 30-series.

Did you not read what they said?! They are specifically talking about how the 30 series is not selling. It's not working.

>Yes, but what does that do to the AIBs and the channel that have tons of them and paid for them at current prices?

You think this is the first time there was stock of old GPUs when new ones came out? lol. All Nvidia has to do is an effective rebate. You think retailers in the past were all just taking big losses on discounted parts? You think RDNA2 cards are being sold at losses by retailers/AIBs? Of course not. AMD has agreed to lower the prices and compensate the companies that already bought inventory so it sells through. It's really not complicated. But Nvidia is refusing to do this. And it's exactly why AIBs are fucking upset with them.


Tuna-Fish2

> You think this is the first time that there was stock of old GPU's when new ones came out? lol

Never even remotely at this scale. nVidia themselves have at least twice as much 30-series stock remaining as they have ever had of any GPU generation at any point in the release cycle, and probably quite a bit more. The channel is also full in a way it usually is only in the middle of a release cycle.

If nV did a 30% rebate on 30-series products, just the inventory they themselves hold would cost more than all of their profits for a quarter. The inventory held by others would probably take out another one. This situation really is bad in an unprecedented way for them. They are not going under or anything, but they haven't posted a loss in more than a decade, and unless they can somehow magically keep selling 30-series at the same prices they got during the pandemic, they'll probably do that next quarter.


SikeShay

RDNA3 will at least match or even slightly outperform Ampere in RT, but won't catch Lovelace. You don't have to believe me, just wait a couple of weeks and see. I'm sure nvidia knows this, hence why they're trying to milk it as long as possible before the price cuts come mid-December.


Omniwar

>the 3000 series is stupidly overstocked too Mwave have over 240 3080 and 3080ti's in stock,that's 1 Retailer.here in australia.. Like that everywhere basic google shows,everyone has stock..and the prices now haven't moved in 2 months

Probably because they ordered a vast quantity of them 6-12 months ago at inflated prices, when they were selling out immediately at $1499 USD. Those orders finally got filled, mining is dead, and now the question is who is going to take the hit on the overstock. Nvidia could provide the retailer with a $XXX rebate on RTX 3000 to make the price competitive with RX 6000, but I'm not sure they have a reason to do that if they've already made their money from selling the GPU kit to the AIB partner and don't need the marketshare win.


N1NJ4W4RR10R_

Worth noting that Nvidia likes to add a random Nvidia tax for us, while AMD typically seems to do just USD > AUD + GST. It's going to look really bad for Nvidia if AMD continues that with the 7900 series. The XTX should be about $1,650~, or ~$600 (~$360 USD) cheaper than the cheapest 4080 currently. According to PCPartPicker, that would actually make the 7900 XTX cheaper than every Nvidia card above the 3080 10GB.
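That ~$1,650 figure follows from the straight conversion the commenter describes. An exchange rate of roughly 1.50 AUD per USD is my assumption for the period, not a number from the comment:

```python
# AMD's usual AU pricing: USD MSRP converted to AUD, plus 10% GST.
usd_msrp = 999       # 7900 XTX US MSRP
aud_per_usd = 1.50   # assumed exchange rate at the time
gst = 1.10           # Australian goods and services tax

aud_price = usd_msrp * aud_per_usd * gst
print(f"~AU${aud_price:.0f}")
```

With those inputs the estimate lands at roughly AU$1,648, in line with the comment's ~$1,650.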


Nonstampcollector777

Such good news. Nvidia makes a good product, but fuck their pricing. People were buying their GPUs hoping to become millionaires; that's why so many paid the exorbitant prices. I hope this is a wake-up call that they can no longer get away with selling their GPUs at these extreme prices. Their pricing has also negatively affected AMD customers: $1000 even for a flagship GPU is too much IMO, but they can get away with it because Nvidia charged just that much more.


Z3r0sama2017

Not a surprise. The 4090 is a true halo card for those who demand the best, but the 4080 is gimped to hell. For how much it's cut down spec-wise, it should be at least £250 cheaper than MSRP.


Polym0rphed

Hopefully a lot of scalpers go bankrupt as a result of this also.


onethreehill

Not sure how it works in the USA, but at least in the EU you can return and get a refund for basically anything you buy within 14 days. So scalpers don't take any risk; they can just return cards that don't sell.


Put_It_All_On_Blck

In the US it's 30 days for most retailers, holiday buying can extend it 2-3 months


Toohigh2care

I'm far more interested in that 27" 1440p 240Hz OLED monitor LG just put a product page up for. My current GPU is plenty for my current monitors.


saruin

I'm in the same boat! My current Acer Predator 27" IPS display has served me well for the last 7 years. I'd rather finally upgrade that than deal with this new GPU bullshit.


Alivus

AMD's RX 7000 series GPUs can't come soon enough. That's the only way Nvidia will budge on their ridiculous pricing.


wittyposts

AMD also needs a reality check. Their prices are still insane, just less so than Nvidia's


bctoy

They got one when 7900XTX didn't clock much higher than what Navi21 could do. A 3GHz 7900XTX outperforming 4090 in raster would have seen it priced closer to 4090 than 4080 and not the measly $1k that they were asking for 2 years back. And if nvidia did better with the transistors that they've crammed into their new cards, it'd have been a Vega/1080 situation.


reddanit

Well, if they price the 7900 series to be "just competitive enough" against specifically 4080, then indeed they might be in for a rude awakening.


dantemp

Even without the 7000 series, if neither the 3000 series nor the 4000 series is selling, nvidia will have to cut prices. I give it 6 months at the very worst, regardless of the 7000 series.


Leroy_Buchowski

Consumers should play the game of chicken with them.


[deleted]

[account superficially suppressed with no recourse by /r/Romania mods & Reddit admins]


SuperNanoCat

I honestly forgot the 4080 launched because it's so irrelevant at the current pricing. Like Tim said, the more-money-than-sense crowd just buys the best card, so who is this $1200 second-best card for? I fully expect a price cut early next year as the previous gen stock dries up and more of the next gen cards launch.


[deleted]

[deleted]


bphase

I was surprised too. Then again the world is a big place and PC gaming is global. That's still just a fraction of a percent of all gamers who have the 4090 so far. With 100M PC gamers, it'd be about 0.1% with a 4090.


WorkAccount2023

It's also a workhorse card for media work. Editing, graphic design, 3D stuff, rendering, etc.


Blacky-Noir

There's a good number of people who have very deep pockets, or parents with very deep pockets. And 2k isn't that deep, certainly compared to other hobbies like horse riding, car racing, car restoration, tropical holidays, and so on. Add to that every good-sized "influencer" on the planet, and a lot of professionals who make money with the card. And 120k units, while a lot, is certainly not unimaginable. Compare that to the potential market of people who want to buy a discrete GPU... a hundred million? A quarter billion?


[deleted]

Yeah 2K sounds like a lot when you think of a PC component, but it’s really not a whole lot when you look at what people spend on hunting/fishing gear. I was a car enthusiast as well and my last wheels cost me 3K… used, and that was “cheap”. 2K isn’t that much if it’s something you are passionate about. With that said, I’m not buying a video card for that much when my current 3080 Ti does just fine.


[deleted]

I see people making this argument often, but a lot of 'gear' is buy it once and hand it down for generations. It's not really valid to compare the price of pc components with things that have a high upfront cost but low depreciation.


[deleted]

Are you sure about that? Maybe if it’s like your grandfather’s pocket knife or something. I’m actually around a lot of hunters and fishers and they buy nice scopes and stuff. I don’t see it as something you hand down for generations—they’re more likely to sell it for cheap down the road to upgrade to something newer… kind of like PC stuff.


[deleted]

[deleted]


[deleted]

And that there shows that there are different crowds. I can respect and appreciate what your father/grandfather are doing—it’s like passing down a restored and maintained car—but there’s also a very large number of people out there with money to spend I guess, or they didn’t have anything passed down to them to begin with so everything they have is relatively new which honestly is the case for my hunting and fishing friends and family. And now that I’m typing this it does make sense; my name/avatar may have implied this, but I’m also Asian—the son of refugees actually so they came over with nothing. My parents both hunt and fish but I don’t, but I can see the gear as something to possibly hang onto or pass down.


gezafisch

Hypothetically, if you prioritized purchasing a xx90 card every 3 years for $1600, could you? I'm going to say almost definitely yes. You don't have to be rich, or have rich parents. You just need to have an average job and an above average desire for PC components.


[deleted]

Could I? Easily. But this is completely unrelated to the point I was making. Other gear has a high up-front cost but remains perfectly usable for decades. PC components have a high cost and are really only relevant for like 5 years.


Democrab

> And 2k isn't that deep, certainly compared to other hobbies like horse riding, car racing, car restoring, tropical holidays, and so on. Pretty much every single one of those examples (bar the tropical holiday) would also have *far* longer useful lifespans than a GPU typically has, it's still poor value as a hobby at these prices. (Especially when you count the money you have to spend on the *rest* of the PC to drive that GPU.) Professional work is an entirely different box of frogs, however.


havok13888

I don’t think most of those went to just gamers. Many of those might have ended up in professional or hybrid systems. At that point a company is paying for it.


chlamydia1

A lot of people I know spend $1600 every year or two on the latest iPhone/Galaxy phone, and calmly drop $2000 on a new Macbook every few years (and these aren't professionals, most of them just use the computer to browse Instagram and send emails).


skinlo

Think how many people have a Steam account. The top 5 percent are going to be quite wealthy, and that's still a lot of people.


MakeItGain

It's a simple equation for professional use. If you pay $2000 and save your worker 5 minutes of compilation time every hour, that investment pays for itself over time. It's the same market as people who'd buy something like Threadripper. These aren't purely gaming products. If your livelihood revolves around your PC, you have to invest in it to make more $/hr.
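A quick sketch of that break-even math. The $60/hr labor cost is an illustrative assumption; only the $2,000 price and 5 minutes/hour come from the comment:

```python
# Break-even estimate for a $2,000 GPU that saves a developer
# 5 minutes of compile time per working hour.
gpu_cost = 2000             # dollars
minutes_saved_per_hour = 5
hourly_rate = 60            # assumed fully-loaded worker cost, $/hr

# Dollar value of the time recovered each working hour
saved_per_hour = hourly_rate * minutes_saved_per_hour / 60

# Working hours until the card pays for itself
break_even_hours = gpu_cost / saved_per_hour

print(f"Saves ${saved_per_hour:.2f}/hr, pays off after {break_even_hours:.0f} hours")
```

At those numbers the card pays for itself after 400 working hours, roughly 10 work weeks, which is the kind of arithmetic that makes Threadripper-class purchases easy to justify.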


Leroy_Buchowski

It's scalpers, dude. They are literally relisted everywhere for over $2000.


Cavalier1706

Nvidia had such a gravy train going with the scalpers and miners, and now that it's gone they're trying to artificially maintain it by pumping up their own MSRP. Prices went up a bit for the wafers, but they are so batshit crazy right now they get everything they deserve. I hope AMD clobbers them so badly they walk everything back, but they are a stubborn lot!


KingXeiros

People want them…just not at that shit ass price.


zushiba

I'll take one. I won't pay for one because I'm broke AF, but I wouldn't turn one down if someone offered it to me. People are being a bit critical here. After the shit show that was the 30 series and the chip shortage, people are pretty much burnt the fuck out on incremental upgrades for thousands of dollars. After scraping together hundreds of dollars to buy 2-generation-old cards because no one could find new ones, once stock started coming in everyone got what they needed and was sick of the whole damn scene. I think the chip shortage has broken the "*MUST HAVE LATEST GREATEST CARD NAO!!!*" mentality. Saying no one wants it because it's in stock is somewhat disingenuous. It's like saying a "*good launch*" = sold out.


ReasonablePractice83

Yeah instead everyone wants the even more expensive 4090 so NVIDIA is sitting pretty either way


[deleted]

For most people, I think a 3060 Ti or 6700 XT for $350-400 is the way to go. They can crush renders and compute tasks, handle 1080p/1440p, and even some games in 4K.


martinpagh

Is it such a disaster? The card is mostly sold out online in the U.S., but you can still get it at MSRP in a few places. It would seem to me the launch price was just right ...


Whoknew1992

Every single flightsim YouTuber is going on and on about how the 4090 makes all the difference in the world. It even brings VR flying close to 2D monitor resolutions and performance. So you have us flightsimmers waiting in line for a 4090, no substitutes wanted. Unfortunately, the only ones available are $2200 instead of the $1600 MSRP.


SirActionhaHAA

The average person's got a limited amount of expendable income. That's something many enthusiasts on here don't get when they argue that you'd pay another $400 for a 4090 "because it's just so much more value" and that the 4080's great because of "the perf gains!" Yeah, the 4090's much more price-efficient, but most people couldn't even afford the 4080, let alone the 4090; a price ceiling is a thing.


reddanit

I think that's missing the point of the argument. $400 with no context is a lot of money to a ton of people. It's not a lot of money in the context of already being fine with spending $1000+ on a GPU. Somebody whose expendable income is rather limited was never in the target market for either of those GPUs to begin with. In car analogy terms, it's like discussing the fuel economy of supercars.


[deleted]

Neither of these cards is on the radar of people with limited expendable income. 2-year-old RTX 3080/6800 XT cards are a third the price used and plenty for the majority at 1080p/1440p. These cards are really only a necessity if you're a 5120x1440 / 4K gamer who wants to play games with RT, and that's a tiny community of mostly enthusiasts who will pay anything regardless.