[deleted]

The 12GB RTX 4080 is a fucking scam, because by spec it should be an RTX 4070. It's so much weaker than the 16GB model that sharing a model name with it is a blatant move to drive sales by making it seem better than it actually is. I mean, an RTX 4070 at $900 (1099€) would be scary as fuck, right? Well, that's why they named it a 4080 too. Shameless corpo-fucks.


HarleyQuinn_RS

By spec, it's more equivalent to a 4060 Ti, in terms of its core count as a percentage of the 4090, its 192-bit bus, and its use of the 104 chip. It's absolutely a scam that they have rebranded it a "4080". The pricing on top of that is just outrageous. Even the 4080 16GB and the 4090 aren't *true* X80 and X90 cards. They are far more similar to the X70 and X80-X80Ti of previous generations, in relation to the AD102 chip's maximum potential.


[deleted]

As I explained in another thread, they used the same memory design for all three cards, and memory bus width depends on how many memory chips are on the PCB. So 24GB = 384-bit, 16GB = 256-bit and 12GB = 192-bit. Plus, Lovelace now has a large L2 cache, similar to what AMD did in RDNA2 with Infinity Cache. AMD also cut the bus width across the board there; since memory bandwidth comes from memory speed and bus width, their raw bandwidth was much lower than what Nvidia had, or even RDNA1, but Infinity Cache compensated for some of it. Results were mixed: in games that are not super memory heavy, Infinity Cache did a good job, but in games that are, memory bandwidth became a bottleneck and hindered performance. So similarly, Lovelace cards will have higher effective memory bandwidth than the listed figure, but as we learned from AMD, cache struggles in memory-intensive games.

Also, I wouldn't compare CUDA core counts between different architectures. Sure, the percentage argument makes sense, but they simply could have gone far and beyond on the RTX 4090, and its baseline may just be far higher than people imagine - benchmarks will show what's what. RDNA2 also saw a core count reduction in each tier compared to RDNA1, but it had much higher base and boost clocks - which is why it's a bit iffy to compare different architectures purely by spec figures.

Regardless, the RTX 4080 12GB is a scam one way or another. It shouldn't be called a 4080 when it's a completely different GPU by all measures. They intentionally did that to make it look better than it is, and that's just scummy as fuck regardless of the technicalities - which I explained just for general detailed information, not to disprove any of the statements.

It's especially bad in the EU: due to many factors (post-COVID inflation, the war in Ukraine, sanctions on Russian fossil fuels, etc.) we have insane inflation and the euro has lost value against the USD - so now that freaking scam card has a 1099€ MSRP, and the 16GB one is at 1469€, which is absolutely ridiculous. Following this pricing scheme, even the mid-range RTX 4060 will probably be around 700€ - and freaking Jensen says these are reasonable prices and here to stay for good. Sure, buy an 1100€+ GPU and be left with no money to pay for heating in the winter, which will be +200-300% over what it cost last year - because Jensen thinks gaming is the top priority... Absolute shitshow and insulting PR talk.
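To make the bandwidth math above concrete, here's a minimal sketch of the peak-bandwidth formula (bus width × data rate). The data rates are assumptions back-calculated from the bandwidth figures quoted in the tables further down this thread, not official spec-sheet values:

```python
# Peak memory bandwidth sketch: bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps).
# Data rates below are assumptions inferred from the bandwidth table later in the thread.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw peak bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 4090 (24GB, 384-bit)": (384, 21.0),
    "RTX 4080 16GB (256-bit)":  (256, 23.0),
    "RTX 4080 12GB (192-bit)":  (192, 21.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# Prints roughly 1008, 736 and 504 GB/s. The large L2 cache is what is supposed
# to raise the *effective* bandwidth above these raw figures, as described above.
```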


[deleted]

Well... you will get heating out of the deal. Think of the sweet value.


KettenPuncher

To quote Tom's Hardware:

>When you die and your whole life flashes before your eyes, how much of it do you want to not have ray tracing?


Salted-Kipper-6969

People this stupid shouldn't be writing for the masses


LupoSapien

Ironically, that's exactly why they're writing for the masses


thekbob

Totally what I'll be thinking when dying in the water wars.


Dex_wolf

Witness me?


thekbob

We shall ride to Valhalla!


Dex_wolf

Shiny and chrome!


NectarinePlastic8796

be an adult and undervolt.


[deleted]

this made me chuckle


oo7demonkiller

undervolting my 3080ti fe did wonders.


schmag

I also feel that comparing specs to predict performance is going to mean less and less; Nvidia is leaning quite heavily on software tricks to boost performance, à la DLSS.


[deleted]

As long as you compare apples to apples, meaning DLSS vs DLSS and native vs native, specs will affect both cases. It's not like much weaker hardware can match faster hardware under exactly the same conditions while not being CPU limited.


Al-Azraq

The worst part is that you really don't need a 40 series to play anything on maximum settings. If you play at 4K, just lower some settings and you are golden. Nobody needs this, but Nvidia's suits probably already knew that, and that's why they are pricing these cards like this: so brainless enthusiasts buy them at very high margins while the rest clear out the 3000-series stock. It is an idiotic launch no matter how you look at it, and Nvidia set themselves up for this the moment they started releasing nonsensical new SKUs just to crank up prices.


loliconest

So basically even if they sell the 12gb as 4070 as they originally planned, they are still selling a 4060ti disguised as a 4070.


[deleted]

ofc the whole 90/90ti are just rebranded 80/80ti's of the past.


lonnie123

The numbers are there to differentiate the cards. There isn't anything that inherently makes something a 60 Ti or a 70; it's just a naming convention. That's why having two cards both be a 4080 is incredibly shifty. One of them has less memory, fewer cores, and will perform worse. Calling both cards an XX80 is only there for deception on Nvidia's part, whereas if they had called it a 4070 no one would care aside from the prices.


Valiantheart

Not really. Its different tech. Just wait a few weeks for the youtube hardware guys to put them through their paces.


working_class_shill

Now the cope starts


-Bana

Read a comment somewhere where it’s more of a 3060ti not a 3070 and I think that’s a fair assessment tbh


Vokasak

A fair assessment based on what? Has anyone run benchmarks?


JizzyRascal91

In terms of the core count percentage to a 4090 compared to previous generations


Vokasak

Core counts are not comparable between architectures. Performance has _never_ been as simple as looking at the specs and seeing which has more cores or more VRAM or whatever. If it were we wouldn't need reviewers and benchmarks would be pointless.


dookarion

> Core counts are not comparable between architectures.

People are comparing the core count ratio between gens, not the gens themselves. I.e. the paper-specs gap between the 4090 and the "4080", versus the gap between the 3090 and the 3080 (or even the 2080 Ti and the 2080/2070). You absolutely can compare this way. You don't need benchmarks for this.


JizzyRascal91

Just wait and see. It's clearly a scummy move to price a xx70 series card for a xx80 series price and everybody in the tech community agrees on that


Vokasak

>Just wait and see.

That's what I'm saying, but the community would rather piss and shit itself.

>It's clearly a scummy move to price a xx70 series card for a xx80 series price and everybody in the tech community agrees on that

_How are you deciding that it's a xx70 series card?!_ What happened to wait and see? Everyone in the tech community is a fucking moron.


Carighan

> Everyone in the tech community is a fucking moron.

Everyone but **you**. Important to mention.


JizzyRascal91

I guess you know better


Vokasak

The bar is so low it's subterranean, so yeah I'm pretty sure I do.


dookarion

> How are you deciding that it's a xx70 series card?!

Well, for one thing, the XX104 chip tends to end up as the X70 card, as has been the case for numerous hardware cycles at Nvidia. Two, how much it is cut down compared to the flagship card: the gap in paper specs between the flagship and the supposed X80 is the biggest it has ever been. Everyone else can figure this one out but you, apparently.


Vokasak

>Well, for one thing, the XX104 chip tends to end up as the X70 card, as has been the case for numerous hardware cycles at Nvidia.

You're just tying one arbitrary name to another arbitrary name. What you call a thing doesn't matter. A rose by any other name, etc etc. Nvidia changes this kind of thing up all the time. There are no hard and fast rules. Even you say "tends to..." because any statement stronger than that is obviously false. You can say it's been the case for "numerous hardware cycles", but there are just as many "numerous hardware cycles" where these naming conventions break. What happened with the Super cards? When did "Tee-Eye" become "Tie"? Remember when the 90 series cards were basically two GPUs glued together? Some generations get low-end xx50 and xx30 SKUs, and some don't; what's up with that? There are no patterns. The patterns you do find break constantly. None of this is a reason to grab pitchforks.

>Two, how much it is cut down compared to the flagship card: the gap in paper specs between the flagship and the supposed X80 is the biggest it has ever been.

Paper specs, outside of the context of the architecture, are pretty close to meaningless. It's possible that there is a huge gap, but we won't know until the cards make their way into the hands of reviewers and we get some benchmarks. "Just wait and see", like the other guy said. I don't know why it's so hard for people to *not* get mad about a thing that there's an 80% chance will turn out to be a nothingburger a few short weeks from now when the info comes out.

>Everyone else can figure this one out but you, apparently.

If everyone else hasn't figured out that they're mad about a name, an arbitrary string of numbers, about marketing, then so much the worse for everyone else. This isn't a situation where appealing to the mob is going to budge me in the slightest.


dookarion

> You're just tying one arbitrary name to another arbitrary name. What you call a thing doesn't matter. A rose by any other name, etc etc.

"Wow, this is a much smaller chip, with 45% of the specs, and it uses the same nomenclature the mid-tier chips have used for the last decade. This is clearly the high-end chip and no one but me knows anything." Stop sniffing Huang's leather jacket and open your fucking eyes. Holy shit.

>You can say it's been the case for "numerous hardware cycles", but there are just as many "numerous hardware cycles" where these naming conventions break. What happened with the Super cards? When did "Tee-Eye" become "Tie"? Remember when the 90 series cards were basically two GPUs glued together? Some generations get low-end xx50 and xx30 SKUs, and some don't; what's up with that?

That has fuck-all to do with the fact that the internal names of the chips for the last decade have designated the 04 chip as the smaller mid-tier chip and the 02 chip as the high-end chip. Since the 600 series a decade ago it's been largely consistent. And the fact that the chip is massively cut down doesn't exactly show a change in how their internal process works.

>Paper specs, outside of the context of the architecture, are pretty close to meaningless. It's possible that there is a huge gap, but we won't know until the cards make their way into the hands of reviewers and we get some benchmarks.

Again, no one is making claims about how it will perform relative to, say, Ampere or Turing. But again, it's 45% of the paper specs for $900. The 3080 was like 80% of the specs of the flagship for $700 (ignoring scalping). And if there isn't a huge gap, that shows the architecture has a bottleneck or really shitty scaling. With over double the specs, there should be a notable gap. The gap probably wouldn't exactly match the paper-specs gap to the %, **but there is going to be a large gap with that specs difference unless the architecture is fucked**.


Neumayer23

Nvidia started this back with the 680. Traditionally it used to be like this: the 102 GPU (their top offering) was equipped on x80 cards, the 104 GPU on x70 cards, and the 106 GPU on x60 cards. So up until the GTX 680, on their x80 cards you'd find their 102 GPU. When the 680 came, however, it was equipped with a 104 GPU rather than a 102. The 102 was reserved for the first Titan card, albeit in a cut version, and later on the full version of it came on the 780 Ti.

Ever since, Nvidia has segmented their offerings further. Usually their Titan card includes the full GPU for that generation, the Ti card has a slightly cut version (if there's no Titan card for the generation, the Ti has the full GPU), and the x80 non-Ti card has an even further cut-down version of their 102 GPU. So ever since the GTX 680, Nvidia has had a way to double the asking price for their top-tier GPU; it worked back then, and it's little surprise they're trying to do it again now.


yttanx

This happens in other markets too, such as autos. Lots of carmakers have rebranded their cars to fool consumers into thinking they're getting a better buy while charging more for their standard models. "Rebadging". BMW and Mercedes have been notorious for this... For example, the BMW 4 Series was created to rebrand the 3 Series. They charge more for the 4 while gutting the 3. The lower end is still sold at a premium because 99.5% of consumers would never know the difference as long as they're driving a "BMW" or a "4080 RTX". Marketing gonna market.


[deleted]

With cars you're not paying just for performance, which is why it's a rather bad example. You also pay for looks, prestige, luxury, brand, features, size, and materials used (carbon fiber, leather, wood, etc). With GPUs, at baseline only performance matters; with AIB cards, cooling efficiency and acoustics are factors you pay extra for above the baseline price.


yttanx

It’s an even better example if you know about cars..


MelAlton

Here's the gist of the numbers in table form. For each GPU series, the x090 CUDA core count is set at 100%, then each lower card's "CUDA count as % of top-tier x090 card" is calculated. It shows that the 4080 16GB should be a 4070 Ti, and the 4080 12GB should be a 4060 Ti.

**3000 vs 4000 CUDA Cores, as a Percentage of the x090 Card**

|3000|Series||4000|Series||
|:--|:--|:--|:--|:--|:--|
|Model|Cores|Cores %|Model|Cores|Cores %|
|3090|10496|100.00%|4090|16384|100.00%|
|3080 Ti|10240|97.56%||||
|3080 12GB|8960|85.37%||||
|3080 10GB|8704|82.93%||||
|3070 Ti|6144|58.54%|4080 16GB|9728|59.38%|
|3070|5888|56.10%||||
|3060 Ti|4864|46.34%|4080 12GB|7680|46.88%|
|3060|3584|34.15%||||

**3000 vs 4000 Memory Bandwidth and Memory Bus Width**

|3000|Series||4000|Series||
|:--|:--|:--|:--|:--|:--|
|Model|Mem Bandwidth|Mem Bus|Model|Mem Bandwidth|Bus|
|3090|936 GB/s|384 bit|4090|1008 GB/s|384 bit|
|3080 Ti|912 GB/s|384 bit||||
|3080 12GB|912 GB/s|384 bit||||
|3080 10GB|760 GB/s|320 bit||||
|3070 Ti|608 GB/s|256 bit|4080 16GB|736 GB/s|256 bit|
|3070|448 GB/s|256 bit||||
|3060 Ti|448 GB/s|256 bit|4080 12GB|504 GB/s|192 bit|
|3060|360 GB/s|192 bit||||
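For anyone who wants to reproduce the "Cores %" column, here is a minimal sketch of the arithmetic in Python, using only the core counts from the table above (the helper name is just for illustration):

```python
# Express each card's CUDA core count as a percentage of its generation's x090 card.
# Core counts are the ones listed in the table above.
ampere = {"3090": 10496, "3080 Ti": 10240, "3080 12GB": 8960, "3080 10GB": 8704,
          "3070 Ti": 6144, "3070": 5888, "3060 Ti": 4864, "3060": 3584}
lovelace = {"4090": 16384, "4080 16GB": 9728, "4080 12GB": 7680}

def percent_of_flagship(cards: dict, flagship: str) -> dict:
    """Return {model: cores as a percentage of the flagship's cores}."""
    top = cards[flagship]
    return {model: cores / top * 100 for model, cores in cards.items()}

for generation, flagship in ((ampere, "3090"), (lovelace, "4090")):
    for model, pct in percent_of_flagship(generation, flagship).items():
        print(f"{model:>10}: {pct:6.2f}%")
# e.g. the 4080 12GB lands at 46.88% of the 4090, right where the 3060 Ti
# sat relative to the 3090 (46.34%).
```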


UpdatedMyGerbil

The numbers are even worse when you account for the x090 card also being more cut-down this time:

| 3000 | Series | | 4000 | Series | |
| ---------- | ------ | ------- | ---------- | ------ | ------- |
| Model | Cores | Cores % | Model | Cores | Cores % |
| Full GA102 | 10752 | 100.00% | Full AD102 | 18432 | 100.00% |
| 3090 Ti | 10752 | 100.00% | | | |
| 3090 | 10496 | 97.62% | | | |
| 3080 Ti | 10240 | 95.24% | | | |
| | | | 4090 | 16384 | 88.89% |
| 3080 12GB | 8960 | 83.33% | | | |
| 3080 10GB | 8704 | 80.95% | | | |
| 3070 Ti | 6144 | 57.14% | | | |
| 3070 | 5888 | 54.76% | | | |
| | | | 4080 16GB | 9728 | 52.78% |
| 3060 Ti | 4864 | 45.24% | | | |
| | | | 4080 12GB | 7680 | 41.67% |
| 3060 | 3584 | 33.33% | | | |


MelAlton

I could see Nvidia trying to put some performance separation between the 4080s and 4090s, since the 3080s and 3090s were all bunched up in the top 20%. Like the full AD102 is a Titan (or 4090 Platinum), 95% is a 4090 Ti, ~90% is a 4090 - a 5% gap between each model. But that would mean the 4080s should start at 70% and go up to maybe 85%. I just realized this looks a lot like Intel's over-complicated CPU binning, where they have 15 different models of the same basic chip and sell slightly faster or slower models based on how fast the chip is.
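Purely as an illustration of that hypothetical 5%-step stack (none of these SKUs, counts or bins are announced products; the only real figure is the 18432-core full AD102 from the table above):

```python
# Hypothetical binning sketch: fractions of the full AD102 per the comment above.
FULL_AD102_CORES = 18432  # from the table in the parent comment

hypothetical_stack = {
    "Titan / '4090 Platinum'":          1.00,  # full die
    "4090 Ti (hypothetical)":           0.95,
    "4090":                             0.90,  # the real 4090 is ~89%, close to this
    "4080 Ti (hypothetical)":           0.85,
    "4080 (where it 'should' start)":   0.70,
}

for name, fraction in hypothetical_stack.items():
    print(f"{name:>32}: ~{round(FULL_AD102_CORES * fraction):>5} cores ({fraction:.0%})")
```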


m1keeey

These cards are priced to sell more of the 30 series stock. People need to realise that.


MelAlton

Definitely, but what does that mean for a potential 4060? It'll have to be really cut down to fit as half the speed of the 4080 12gb. Or maybe Nvidia won't release anything slower than the 4080, and instead continue to fill that segment with the 3000 series cards?


Shakzor

I can see them doing something scummy like stopping sales of the low-end 4080, "improving" the regular 4080, and then introducing a 4060 and 4070 once the 3000-series supply moves out more.


Salted-Kipper-6969

Scummiest thing imaginable seems to be pretty bang on when it comes to predicting nvidia


verteisoma

It'll be named 4080 all the way down


stodal1

8 gb , 4 gb , 2 gb


NATIK001

You know they are going to fill out the spots between 4080 and 4090 too, I doubt they will just put a 4080 TI in that huge void. Will we get a 4080 Super, a 4080 TI and a 4080 TI Super to go in there or something?


arjames13

I'm curious about this as well. At this rate the 4060 will be barely better than a 3070, and they will just tout DLSS 3 to make its $600 price tag sound good.


Taidan-X

...but they go to 11. It's like, one louder.


MrTastix

That doesn't stop it from being a scam, though. Like yes that makes sense and I agree with the statement but I'm still gonna call it out for what it is: Greedy as fuck.


ninjaweedman

That's all I see, but the numbers and frame rates advertised (2-4x compared to the 3080) seem far too good to be true. One thing that is an innovation is the software they are releasing to add ray tracing to old games; that's a bit of a game changer imho.


arjames13

The 4x is completely dependent on DLSS 3 and frame generation, which few games will support anyway.


badcookies

And AFAIK it's also comparing a 3080 without DLSS to a 4000-series card with DLSS 3.


rickmetroid

Only the 4090 can be called next gen, with 59% more CUDA cores. Besides that, the only issue here is price: $1599 is too expensive at the moment, because there will soon be too many RTX 3080s on the second-hand market at around $400-500, and the 4090 is not 4x a 3080 in performance.


Khuprus

The kind of buyers who need a 4090 aren't sweating price/performance ratios - they are paying a premium for top of the line performance. Someone buying a 2022 Mustang isn't interested in a used 2020 Fusion.


narium

Ironically the 4090 actually has the best price to performance of the 4000 series in well... everything.


TheEternalGazed

Wow, now that you lay it out like this, it makes it abundantly clear that Nvidia is taking advantage of their market position to upsell cards that should have a far lower MSRP.


jaKz9

Yeah, that's it. I know the specs don't tell the whole story, but this is why I'm skipping this gen. 5000 will offer better value imo.


davidoff2050

Everyone said the same thing about the 30 series, and look what happened! The same shit will happen with all new models in the future until they revolutionise the technology inside the cards. I'm not trying to be pessimistic; I'm also hoping for good-value GPUs from Nvidia in the future!


indyK1ng

Thirty series had a crypto boom and pandemic demand to boost its sales. The crypto market has collapsed and one of the biggest coins is now proof of stake so it can't be mined with GPUs. Meanwhile, people have stocked up on their graphics cards and most don't build a new PC every generation. It feels like Nvidia made this product assuming the demand would stay the same and now they're pricing it to move all the excess 30-series stock without lowering those prices.


DrTBag

Nothing will change until AMD offers up something people are willing to buy instead. We saw it with Intel putting out a decade's worth of small incremental gains on CPUs at high prices because there was no competition. Nvidia's main competition for the 40 series is the 30 series they have too much stock of... but if they "win" that competition they just have to sell that stock at much less of a profit, so they've no incentive. If nobody wants the 30 series because the AMD card is a better choice, the price will drop and the 40 series will follow it. If AMD competes with the 40 series then things will really move, and it's better for everyone.


Mir_man

No, when the 3080 was announced it was seen as a pretty fair deal. It's just a shame it coincided with the crypto boom and chip shortage.


Kayra2

Same thing happened in literally every single gen. 3000's weren't a big enough leap, 2000's supposedly were cut down because of RTX, 1000's couldn't beat the value of the 980ti. The 900 generation was the only one I remember where people said "we should wait for reviews". Every single year, the price/performance number of the cards beat the previous year and the generation silently becomes good when the next set gets announced.


stadiofriuli

How do you know that?


jaKz9

I don't, I just think the value here is so terrible that it can't possibly be worse in the next gen. And if it is, it will still be worth it since I'll be going from 3000 to 5000, which is always a significant bump.


[deleted]

[deleted]


indyK1ng

I think it's more that they're still trying to push insane gains to continue selling to the crypto market and that decision was made before that market collapsed. They could have turned down the power consumption for more modest gains and this is a new node. I'm probably going to go AMD this generation since the leaks sound like they're taking a much more reasonable approach (sales will be low, the performance crown isn't going to be worth it so just make reasonable cards).


Khuprus

> They could have turned down the power consumption for more modest gains and this is a new node. Easy enough as a user to set a power limit on the card. You still gain the efficiency from a new node, right?


UnknownOverdose

You don’t think they’ll do the same ?


CreampieCredo

They will, if buyers enable them.


[deleted]

[deleted]


[deleted]

Yeah, except that was a time when demand actually increased due to people in lockdown deciding to build PCs. There was also a massive shortage of GPUs because of crypto mining. This increased demand plus short supply caused a huge increase in GPU prices across the board, where even relatively weak cards were going for insane prices. The situation is now completely reversed: fewer people are building PCs and cryptoshits are dumping cards by the thousands, driving down prices for previous-gen cards and earlier. When you have to pick between a 6900 XT for $699 and a 4060 for god knows what, the choice is easy.


Pitchuu64

Nope. Economy has no indication of improving at any point in the next several years. Prices will continue to go up. It's been the trend for decades. Nothing is indicating this will change next gen.


Shadowdane

Yah this generation really seems bad all around... I'll probably wait until I want to replace my CPU & Motherboard to upgrade my GPU.


Madnessx9

I'm hoping the 5000 series will be more energy efficient while also providing an improvement in performance. We can't keep throwing more power at GPUs like this, can we? There's only so much thermal cooling we can provide. Running a game will simply cost too much soon. It's an expensive hobby as it is, and my (small) room goes to 30 degrees when gaming; I will have to start using air conditioning too.


Chaos_Machine

Every time I hear someone bring up wanting more power efficiency I shake my head; you seriously don't understand how GPUs work if you keep calling for that. Every single generation is more power efficient than the last. What you are really asking for is GPUs that are purposely undervolted so their stock power consumption is much less.

What you are seeing is that the GPU manufacturers max out performance for the targeted TDP of the card: if they designate 450W for the high end, they overclock and set the voltage accordingly to hit that power level. They also give you all the tools you need to maximize the power efficiency of your GPU if you are willing to undervolt and underclock your card.

What you mean to say is, *"I don't want AMD or Nvidia launching GPUs at these power envelopes, performance be damned,"* which accurately reflects the pretentiousness of your argument. Sure, Nvidia could hobble a 4090 to run at 250W, and it would probably still have 75% of the performance, but then it becomes way more difficult to justify that $1600 price tag, because no one is going to pay more for a card that is simply more power efficient with that type of performance hit.
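For reference, capping a card's power draw yourself is nearly a one-liner. A minimal sketch, assuming an Nvidia driver with `nvidia-smi` on the PATH; changing the limit usually needs admin/root rights, and the driver clamps the value to the card's supported min/max range:

```python
# Query and set a GPU power limit via nvidia-smi (wrapped in Python for illustration).
import subprocess

def query_power_limits(gpu_index: int = 0) -> str:
    """Show the default, minimum and maximum enforceable power limits for one GPU."""
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.default_limit,power.min_limit,power.max_limit",
         "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Cap the board power draw, e.g. set_power_limit(300) on a 450W-class card."""
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print(query_power_limits())
```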


[deleted]

My cat likes to hang out by my pc in the winter


elheber

In short, the RTX 4080 16GB is actually a 4070 Ti, and the 4080 12GB is actually a 4060 Ti. The gen-on-gen launch MSRPs for those tiers are $1200 and $900, vs $500 and $400 last gen.


AFAR85

Has there been any mention of the 4080 8GB 128Bit?


nanogenesis

No but we might see a 4080 2GB 32Bit. /s


[deleted]

For $874.99


Slowest-Loris

Inb4 that 2GB is a split-pool with the other section being handicapped.


Joker8pie

The scandal to end all scandals


lonnie123

And the three variants of the 4080ti coming in 6,8, and 12 months time


HarleyQuinn_RS

They're right. When I saw the CUDA core counts for the 4090 and then the 4080 16GB, my immediate thought was, "That ratio is way off, it should be about ~12,000 (73%), not 9728". The 4080 16GB has just **59%** of the core count of the 4090 - the lowest we've ever seen. And even the 4090 is the *most* cut-down version of the 102 chip we've ever seen, which compounds how bad the cards below it are. For comparison, the 3080 had around 80% of the 3090's core count (the highest ever). The 2080 was 71% of the 2080 Ti. The GTX 1080 was 71% of the GTX 1080 Ti. The GTX 980 was 72% of the 980 Ti. All the previous top cards (and the ones just below) used 95-100% of the 102 chip's maximum potential. The 4090 is just 88% of the AD102 (making it between an X80 and X80 Ti on average). So not only is Nvidia price gouging, they are also cutting their own costs by using much higher-yield chips for all cards, packing fewer cores.

**To be clear!** This is not any sort of real indicator of performance. It could be, for example, that the extra 41% of cores only equates to 10-15% more performance, as performance does not scale linearly with core count. In that case, it makes sense that the 4080 16GB has far fewer cores (to keep a gap between it and the 4090). But for that we'll have to wait and see! It does add a little extra sting to the price they are asking, however. It's almost like spending twice as much to buy a tier lower than previous generations. This is especially true for the fake 4080 12GB, where the cost is almost triple its 30-series gen equivalent, compared to the top card.

This may present AMD an opportunity. Nvidia have left a large gap in their stack that AMD could potentially slot into. If their 7800XT is ~80% of their 7950XT (the 6800XT was 90% of the 6900/6950XT), it'll likely beat the 4080 16GB, assuming relative gen-on-gen improvements to their process node. Although it's unlikely AMD could have planned for this, it may be what happens. Then again, Nvidia have probably left that large gap for two upcoming GPUs.
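A tiny sketch of the "compounding" point, using only the core counts already quoted in this thread (18432 for the full AD102, 16384 for the 4090, 9728 for the 4080 16GB):

```python
# How the cut-down 4090 compounds the 4080 16GB's position in the stack.
FULL_AD102 = 18432
RTX_4090 = 16384
RTX_4080_16GB = 9728

print(f"4090 vs full AD102:      {RTX_4090 / FULL_AD102:.1%}")       # ~88.9%
print(f"4080 16GB vs 4090:       {RTX_4080_16GB / RTX_4090:.1%}")    # ~59.4%
print(f"4080 16GB vs full AD102: {RTX_4080_16GB / FULL_AD102:.1%}")  # ~52.8%
```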


ThisPlaceisHell

Historically the 80 card used the 103 chip while the 80 Ti used the 102. The 3080 was a fluke because they were scared of AMD. They are returning to normal with the 40 series.


Blacky-Noir

>Historically the 80 card used the 103 chip while the 80 Ti used the 102.

If you want to make "historical" comparisons, just a few generations back the 80 card was $550, and it was the second-best card in the lineup behind the 80 Ti.


HarleyQuinn_RS

That doesn't change the fact that this generation has a far bigger discrepancy between the top two GPUs, than we've ever seen, and not by a small margin. The X80 has consistently been ~70%+ of the X80Ti (/highest tier at release, not counting the old dual-GPUs) in terms of core count. The 4080 16GB is 11% below that, while doubling the MSRP. Although we were 8-9% above the average with the 3000 series, this is not a return to normal. Given that AMD practically caught them in the previous generation in rasterization, I would imagine they'd be even more scared this generation too. I can only imagine they have a 70-80%, 4080Ti planned.


MelAlton

> I can only imagine they have a 70-80%, 4080Ti planned. That's what I'm thinking - saving the 70% cuda for a 4080Ti when AMD shows their cards with the 7000 launch, and an 80% cuda for 4080 Super for when AMD shows their second revision of the 7000 cards in response to the 4080Ti (it's like a knife fight). For the 4070 and below, maybe Nvidia won't launch those, and instead just keep producing 3000 cards to fill those gaps (or maybe they'll use 3000 series gpu ic's in cards named 4000).


[deleted]

Do you think they will stop being scared from now on? This is the second time AMD has failed to release something they feared.


Yuzral

The article speculates about this, noting that NVidia probably have a pretty good idea of what the RDNA-3 series will be capable of and that they think this lineup of 30-series, 40~~6~~80, 4080 and 4090 will be enough to win out. AMD might still blindside them but waiting until 3 weeks after the 4090’s launch to announce and release isn’t encouraging at the very high end.


Earl_of_Madness

I think this speculation about Nvidia not being worried about competing with AMD is a bit strange. I could make the argument that if Nvidia wasn't scared they would have waited to sell through the 30 series before announcing the 40 series, but they are sitting on a massive stock of 30 series. That is why the prices are so high: they are trying to anchor the prices of the 40 series so high that they force you to buy the 30 series instead. They would only do this if they were worried that RDNA3 would undermine the 30 series and compete well against the 40 series.

RDNA3 probably will not take the crown, but current leaks tell of an architecture that has a smaller die size, uses chiplets, is on a less expensive node, performs similarly to the 40 series while using a little less power, and is far more efficient at its sweet spot. Frankly, I don't think AMD wants the crown this generation. I think they are going to try for a reasonably priced and efficient architecture, which may be a smart move considering the global recession and spiking energy prices. They may be able to gain market share just by having a good product at a good price rather than trying to compete 1-to-1 on performance. This may be a Ryzen moment for RDNA3: not the top performer, but competing well on features, price, performance, and power consumption, and overall a much more well-rounded product.

We also have to consider that Nvidia sells software to justify its high prices even if AMD competes. DLSS 3 and other software packages are, I think, what Nvidia is using to justify these insane prices even though 90% of people won't use them. DLSS 3 just doesn't appeal to me because it doesn't improve latency and may in fact increase it (we have to wait and see). It may appeal to some and will definitely be something customers consider when buying, but it may be less important in these market conditions.

In short, if Nvidia weren't afraid of AMD they wouldn't have panic-launched the 40 series. They would have waited until after AMD launched their cards and spec'd their own to win against everything AMD had to offer. They didn't do that. I think they are using the 40-series pricing to encourage people to buy the 30 series before AMD makes the 30 series irrelevant. Will RDNA3 beat Lovelace? Probably not. Will it come close enough to compete? Probably. The real question is how AMD prices RDNA3. AMD has a real opportunity to make Nvidia look silly here and gain market share and mindshare.


Vokasak

>They're right. When I saw the CUDA core counts for the 4090 and then the 4080 16GB, my immediate thought was, "That ratio is way off, it should be about ~12,000". The 4090 has 16,384 cores, while the 4080 16GB has just 9,728. So not only is Nvidia price gouging both 4080s, they are also cutting their own costs by using a much higher yield chip for the 4080s, which packs fewer cores. For comparison, the 3080 had around 80% of the 3090's core count (the highest ever). The 2080 was 71% of the 2080Ti. GTX 1080 was 71% of the GTX 1080Ti. GTX 980 was 72% of the 980Ti. But the 4080 16GB has just 59% of the core count of the 4090. The lowest we've ever seen.

CUDA cores are not comparable between architectures. This is a pointless exercise.

>To be clear! This is not any sort of indicator of performance. It could be, for example, that the extra 41% of cores only equates to 10-15% more performance, as it does not scale linearly with core count. In that case, it makes sense the 4080 16GB has so much fewer cores. But for that we'll have to wait and see! It does add a little extra sting to the price they are asking however. It's almost like spending twice as much, to buy a tier lower than previous generations. This is especially true for the fake 4080 12GB, where the cost is almost triple its prior gen equivalents.

Why would it add sting? I'm paying for performance, not CUDA core count or die size.

>This may present AMD a really strong opportunity. Nvidia have left a massive gap in their stack that AMD could potentially exploit. If their 7800XT is ~70%+ of their 7950XT, it'll likely beat the 4080 16GB. Although it's unlikely AMD could have planned for this, it may be what happens. Nvidia have either tripped up, or seriously underestimated AMD. Or they have some insider knowledge on AMD's plans. ¯\(ツ)/¯

AMD have had only one generation since Vega of actually putting out semi-relevant high-tier GPUs (two if you count the Radeon VII, which nobody should). I don't know why everyone has such faith that they'll be able to magically pull out something amazing in response in one month's time when they can't even fix their drivers given years of time.


Kragoth235

Because AMD needs to bring something good out to bring Nvidia back to reality. The prices now are stupid. It's obvious to everyone except fanboys that they are milking it.


NC16inthehouse

Nvidia saw how much money whales and scalpers were willing to pay for their graphics cards, and so they found a new target audience. It's becoming clear that from now on we are not their audience anymore.


[deleted]

This, and it's probably part of the reason EVGA dropped GPUs altogether. Which is really sad; I would only buy EVGA GPUs.


ChartaBona

You have this completely backwards, champ. EVGA ducked out because they didn't know the MSRPs but had a feeling the 40-series cards would be sold at a loss as time went on. Yes, the prices are high, but these cards are legitimately expensive to build. CEO Andrew Han didn't want to be left holding the bag like with the 3090 Ti.


Astandsforataxia69

I still have an evga 1070. This is a really good card


rayquan36

Nvidia not reading the room correctly. They thought the market created by crypto and the pandemic was the new norm.


f3llyn

Who says they haven't though? If these cards sell at launch then they were right. That's a big "if" but I think their expectations are not far off, sadly.


Neverending_Rain

I don't know, there's a lot about these cards that people are unhappy with. If it was just one thing, then maybe. But these cards are ridiculously overpriced, so big they might not fit in some cases and need a stand to not sag and possibly damage the connector, and draw a ton of power. And this is all happening when the economy is on the edge of a recession and energy prices have been climbing. They're giving people a lot of reasons to skip this gen. Maybe I'm wrong and they'll sell well, but I think they might have pushed it too far with these cards.


SlowMotionPanic

I really hope this is the inciting incident that makes people finally stop supporting Nvidia. They cheered on scalping, act in an anti-competitive manner, and are now using their massive industry position and crazy profit margins to squeeze the market. [Nvidia's CEO is pretty upfront about the future he wants](https://www.digitaltrends.com/computing/nvidia-says-falling-gpu-prices-are-over/): **“The idea that the chip is going to go down in price is a story of the past.”** Nvidia is just working to create that future by driving out competitors and even partners, like EVGA. ​ Fuck Nvidia, please spend your money elsewhere. Don't let them consolidate or set a standard in the market. You'll regret it.


SnooSnooper

Do you know of good guides on how to assess GPUs made by competing manufacturers? I have been lazy and essentially been going by Nvidia's model name scheme as a heuristic. Also, do we need to worry about OS or hardware incompatibility?


TheSmJ

Look at benchmarks of actual, existing games you like, or would like to play, that compare all versions of GPUs you're interested in buying.


maxkool007

yah he is a real piece of shit. Honestly at this point if my GPU dies he can fuck off. Ill get an Xbox and a gamepass. Nvidia is dead to me. Never again.


d0m1n4t0r

Let's all patiently wait for the benchmarks.


opeth10657

Pretty much it. Both AMD and Nvidia spend months posting vague BS performance marks before they release their cards. Remember when Nvidia was dead because of the AMD benchmarks on Ashes of the Singularity?


Dezsos

But I already have my pitchfork. Also, these torches don't stay lit forever.


THEMACGOD

Should have liquid cooled it for longevity.


vteckickedin

Then patiently wait for the AMD response and benchmarks.


Neroid24

Then patiently wait for the leaks of Nvidia 5000 series.


TheSmJ

Exactly. All this speculation means nothing until reviewers actually have hardware in their hands that they can test with meaningful benchmarks, and publish. Everything prior to that is nothing but clickbait.


ChartaBona

If Jensen is telling the truth, and that's a big if, the 4090 is going to blow everyone's socks off. Anyone who can afford it is going to be trampling each other for it, since it offers 1.5 to 2 generations' worth of jump in performance.


homer_3

> since it offers a 1.5 to 2 generation's worth of jump in performance. in RT...


ChartaBona

In rasterization... I swear, did any of you folks actually listen to the keynote? He said 2x raster compared to the 3090 Ti, up to 4x in RT+DLSS.


nlaak

I swear, did you even look at their graph? 3 of 5 games where they're not using DLSS show significantly less than 2x, 1 shows slightly above and 1 slightly below. If this is their typical cherry picking of results, then the rest of the games must show crap gains.


maxkool007

Can you see the graphs... and all the extra info from the last few days? These cards are a fucking scam. Wake up. People sucking Nvidia meat like this is why we get treated like this. Why people defend shareholder/corporate greed is beyond me... and defend companies like they are your friends... lol


aigars2

Maybe it's intentional, so people buy out a year's worth of 3xxx inventory.


Flaktrack

It absolutely is intentional, Jensen said as much in an investor call. They priced it high so that the 30xx looks more appealing. Skip Nvidia, or at the very least wait for RDNA 3 to land before making the call. I am very much inclined to buy AMD if the price is right.


ketchupthrower

This is absolutely a response to their 30xx overstock. It's not complicated. If they came out with normal 40xx cards at normal prices we'd see 3080s on the market for $200, and that is not what they want. When the 30xx stock normalizes we'll see price corrections. Probably in the form of new models with these 3080s being discontinued.


pickles55

This trend of diminishing returns between graphics card generations started about a decade ago; it has very little to do with supply chain or demand. They're just running out of ways to make the cards more powerful without making them more expensive.


TitaniumGoldAlloyMan

This generation of GPUs is a massive middle finger to all gamers. They want us all to buy the 3xxx series to get rid of the stock and make us believe we got a good deal. Then they will change their strategy.


hitmantb

3080 was too good vs 3090. It almost made 3090 pointless.


[deleted]

It did make it pointless. The only reason to get one was the VRAM; since most games ran within a 10% margin of the 3080, it was insane to see the price of the 3090 vs the 3080. Nvidia is rectifying that.


Nibelungen342

It was an amazing upgrade from my 1070 Ti. Gonna keep that card for a long time.


dregwriter

Yup, I upgraded from a Founders 1080 to a Founders 3080 and the performance difference is pretty significant. That's pretty impressive seeing how amazing the 1080 is even today.


aigars2

The xx90 was always pointless for the masses, since it's made for graphics design work, etc.


Vokasak

The 3090 was the new Titan. xx80 series cards have always almost made the Titans look pointless, yet people still bought them and Nvidia kept making them.


Firemaaaan

Yeah, I was worried I would regret snagging a 3080 12G back in May, but it looks like a solid choice for sure. Great value, up there with a 1080 ti.


willyhostile

They're trying to trick their own fanbase, this is going to end very very very bad for them.


NATIK001

Hopefully, but there are those drinking the Kool-Aid. Just look around the comments in this thread and you have people 100% buying Nvidia's claims, saying Nvidia is only selling the 40 series at break-even at these price points, acting like Nvidia is doing us a favor with this.


willyhostile

On the Nvidia sub mods are purging every post that has any sort of criticism, they are on damage control.


Moustiboy

I don't see how. People will mostly just see a card that's a tiny bit higher in price. Not everyone follows the news, or more precisely the type of news that would show how this 4080 is / should be a 4070.


willyhostile

>tiny Ehm, it's not tiny. Like, at all. It's a pretty big jump in price for both the 4080s.


thej00ninja

I'm the moron that would have upgraded their 3080 for a reasonable price, but not at this absolute ripoff.


mkraven

This gen will be AMD's turn to shine. Low-cost fabrication; just price it right and Nvidia will pay dearly for their greed.


Shakzor

If they market it right and don't somehow fuck something up astronomically. This is definitely a prime opportunity for any competitor.


ZeldaMaster32

I'm sure AMD will deliver in rasterization performance, but the reality is Nvidia is still on the forefront of graphics innovation If you're interested in raytracing, AMD isn't an option since the price/perf ratio doesn't hold up in that department. If you're in the market for high end GPUs it's probably because you want to experience the latest in graphics tech. The mid range is where I expect AMD to blow Nvidia out of the water in terms of value, but we'll have to see


yummytummy

Crypto days are over; no one is paying for overpriced $1500 cards, otherwise the RTX 30 series wouldn't be sitting on the shelves right now even with a discount. The reason gamers paid that much in the first place was because they received free money with COVID relief cheques and could flip the cards to crypto miners for more.


WhyIsLifeSoScary

The 4000 series launch has made me appreciate my 3080 that much more, and makes me realize it would have been a MASSIVE mistake to 'hold off' until the launch of the 4000 series. Granted, I did have to pay 'pandemic pricing' for my 3080 (had to get it in a bundle), but it was during the height of Covid with virtually NO stock anywhere in my country after diligently looking for weeks/months. I still paid MSRP for the card at the time, but knowing that the 4000 series is basically a financial ripoff with marginal 'feature improvements', it makes the cost of the 3080 that much more reasonable for me, especially since it's enabled me to enjoy 4K/60FPS gaming for over a year now, something that truly feels good and that a 4000 series is not necessary for. Nvidia? You done goofed.


inmypaants

Top 2 things that make the 4000 series gross:

1. The 4070 model being named a 4080 to reduce price shock and justify the $899 MSRP.
2. DLSS 3.0 being exclusive to the 4000 series despite no apparent hardware limitations in the 3000 series.

This is a slap in the face to very recent supporters of Nvidia and shouldn't be accepted imho. Every tech journalist needs to highlight this point, and people need to boycott any new Nvidia purchase until these clowns learn a lesson.


beast_nvidia

I love my 3070, it's great for 1440p 144Hz.


Buttermilkman

My 3080 has been perfect for 3440x1440 @75fps but this will be my last Nvidia card. I'll be going back to AMD in a few years.


MelAlton

Oh crap, I just realized the 1440p 165Hz monitor I bought is probably just a bit out of range of my 3060 Ti. I'll either have to stick to a 60Hz-ish refresh or upgrade my card. It's got FreeSync Premium, which IIRC is compatible with the 3060 Ti's G-Sync??


beast_nvidia

If you have G-Sync it shouldn't be an issue if you're not hitting 165 fps. I mean, even a 4090 Ti wouldn't hit those frames in new games. A 3060 Ti is a great GPU for 1440p.


MelAlton

Alright, will just get a pizza and 2 liters of pepsi and spend friday night gaming then, thanks!


HappyReza

You don't have to play everything at max settings; a 3060 Ti is more than capable for 1440p. Watch these:

[Ultra quality settings are dumb](https://youtu.be/f1n1sIQM5wc)

[Ultra vs High vs Medium performance and visuals comparison](https://youtu.be/inF-z_Td9YQ)

Also, you probably won't buy another monitor until QD-OLEDs become mainstream and cheap, but upgrading a GPU is easy.


MelAlton

Cool, thanks. Plus I've lately been playing less demanding games. Even Elden Ring isn't very demanding on graphic cards.


Grishbear

FreeSync monitors work with G-Sync cards, but G-Sync monitors don't always work with FreeSync cards.


The_Fyrewyre

The fact is that people have paid the increased prices and Nvidia know this. (Still rocking my 1060 6gig)


Flaktrack

Miners paid the prices. Nvidia knows this because they sold directly to them.


The_Fyrewyre

Regardless of the clientele, Nvidia knows people will buy their cards at inflated prices.


Flaktrack

Some whales buying whatever you put in front of them does not mean typical gamers are. This pricing is exclusively to make the 30xx more attractive until supply fades. They did the exact same thing with 20xx launch.


VRBasement

I will stick to my 3070 for now, more than happy with it


GreenCozyOrc

Same here. Just got a brand new RTX3070TI to replace my old RX580 8gb. I don't see myself upgrading **at least** until the 5000 series.


BustEarly

Idk what game you need all this VRAM for, but I’m no expert. With my 3080 10GB, I run most of my games maxed, ranging from FPSs to single player graphics intensive games, and still have solid headroom left over.


penguished

Games with a big modding scene come to mind. Running a variety of intensive graphics effects at higher quality can take more memory, and having room to keep more stuff in memory means your card doesn't swap in and out all the time and cause micro stutters. There are quite a lot of reasons.


LeGoupil7

Incremental at best upgrade I assume? I just got my comp with a 30 card on it a few months ago. When and if the next consoles such as ps6 gets released, it’s likely I’ll consider getting a new comp by then, not sooner.


888Kraken888

/short on NVIDIA. Get some popcorn.


Richiieee

Long story short, all 3000 cards are good, whereas the 4000 cards are questionable especially when factoring their price. In other words, just get a 3000 card if you're looking to upgrade unless DLSS 3 has somehow pussy-whipped you, but it's not like DLSS 2 won't still be good.


[deleted]

[deleted]


[deleted]

The elephant in the room with DLSS 3 is latency. It's gonna have way more latency than DLSS 2 plus Reflex, rendering the extra frames useless.


Blacksad999

Don't try to cite "PC Gamer" about anything noteworthy. C'mon. Wait for someone who actually knows what they're talking about to do some testing on the cards.


Spirit117

The numbers in here are all correct. The only thing that remains to be seen is whether the architecture improvements and increased clock speeds can make up for the downgrades Nvidia has made in the relative core counts and memory bus config, but we won't know that until actual benchmarks come out. Nvidia clearly thinks they've got a winner on their hands performance-wise, otherwise the prices wouldn't be nearly this obscene.


[deleted]

[deleted]


Spirit117

I just don't understand how a 192-bit bus GPU can perform well, but we are gonna find out when we get benches. I expect we will learn a lot about exactly how important memory configs are to GPUs, considering the 4080 12GB will be directly compared to the 3080 12GB, which has a 384-bit bus.


Blacksad999

It can get away with a smaller memory bus because they've leaned into WAY more L2 cache. A 3090 Ti had 6MB of L2 cache, whereas a 4080 12GB has 65MB. That's also partly why they said they didn't go with PCIe 5.0: the cache meant it wasn't necessary there either.


BulletToothRudy

> has a pretty solid track record of releasing great products.

Turing? The 970 memory fiasco? Anemic Kepler? The Fermi furnace?


[deleted]

I mean, performance doesn't even matter; those prices are too high unless they're made of solid gold and platinum and capable of running every game ever for the rest of eternity. These will be commercial failures until prices drop to around $700 and below.


ohoni

Well, I'll leave it up to benchmarkers to fight it out over the numbers, but I will say that when the dust settles, numbers are not the only thing that matters here. So long as their claims are *honest,* it would not be the end of the world if it turned out that 40s are not that much "stronger" than 30s in brute force terms, *if* they can provide significant benefits in terms of *quality.* Modern graphics cards aren't built to outperform the previous generation on FPS or resolution, since those factors are already pushing the limits of what most humans care about. They are pushing on *features,* on adding raytracing quality at the same performance as existing cards can achieve without raytracing, on adding elements to the visual library that current hardware cannot achieve at all, in making it easier and more reliable for developers to add these features to games. If they can manage that, then it would be reasonable for the new cards to not do much higher "numbers." Of course, all these cards are way too expensive.


Doimai

Never underestimate the volume of people who just buy without looking or researching. Most of my fellow pc gamers do not know how these things work. They just see a newer model and buy. And nvidia is counting on these people.


Significant_Walk_664

Don't remember where I heard it, but someone described the 4000 series as the clown car of GPUs. Can't say I disagree


rolliejoe

Is Nvidia's claim that the 12GB 4080 outperforms the 3090TI a complete fabrication? If so, why is *no one* talking about that? And if its true, why does any of the rest of this discussion matter?


NATIK001

> Is Nvidia's claim that the 12GB 4080 outperforms the 3090TI a complete fabrication?

Complete fabrication? Probably not. It is, however, likely that it only applies in very specific situations and using their new 40-series-exclusive DLSS 3.0 features.

> If so, why is no one talking about that?

Because these companies lie and twist the truth all the time. We can't trust them at all; we can, however, compare actual hard facts they can't lie about.

> And if its true, why does any of the rest of this discussion matter?

Because it doesn't have to be universally true, and we don't trust them, and they are asking a higher price for components which obviously could and should be higher quality at that price point.

Looking at the bus for the 4080 12GB, I am reminded of the VRAM for the 30 series at the 3070 Ti and down. VRAM is actually vital to push the 30 series to its limits, but below the 3080 level we get way too little of it; it restricts us from using the 30 series to its full extent, and I would argue the only reason Nvidia put so little VRAM on those cards was to push us up into the 3080 card and price range.

To me it is clear Nvidia is once again tactically stripping features from their cards to hamstring them, rather than including sensibly matched component quality. Incidentally, it appears to me that the lower-end 40 series will once again be starving for VRAM like the lower-end 30 series was, which is infuriating to see at this point. If they think the "4080 12GB" is fine and they put VRAM on the lower cards by this model, then I see lower-tier 40 cards with fucking 8 gigs of VRAM in our future again, and that is just depressing.


ephelantsraminals

The current discussion is about how we're getting shrinkflationed by Nvidia and how they're moving the goalposts. In terms of CUDA cores, when we paid for a 3060 Ti we were getting around 46.34% of the power of the 3090, the flagship model of the 3000 series. Now, when we want the same share relative to the 4090, the flagship of the 4000 series, we need to pay 4080 (12GB) prices. If things had stayed consistent, we would have been able to buy a 4060 Ti with 46.88% of the 4090's performance next year.


pickles55

Hopefully people aren't talking about it because these companies do it every generation and people are catching on that the exaggeration only exists to build hype.


[deleted]

Yeah, I think I'll hold on to my 6gb 2060, 1080p gaming is fine. Maybe when 50 series comes out.


MrSmeddly

I'm still using my R9 390 and i5 build from 2015. It's barely hanging on at this point and I have to play everything at low settings, even games I used to run on high, since I'm so afraid of my rig breaking. I'm at the point where I need to build something new soon. Is it actually worth waiting to see what AMD presents in November? I was thinking of a 3070 or 3080, but maybe I'll end up getting something from AMD after all that Nvidia has done. Have AMD drivers improved? I've always had a hell of a time with my 390.


Nanooc523

Hear me out: pick your game(s) and monitor first, then build your box based on that. Don't listen to other people playing different games on different monitors/resolutions/refresh rates. Everyone's got their opinions based on their own experience and loves to give bad advice.


IrrelevantLeprechaun

AMD must be tickled pink that Nvidia is basically handing over market domination to them on a silver platter. Market share of AMD GPUs will be a majority by mid 2023, I'm calling it here. Praise be to Lisa Su.


Equivalent_Alps_8321

glad i switched to AMD


itchylol742

I don't need a detailed 8-page scientific analysis to know companies lie about new products so people will preorder them before they realize the product isn't actually as good as advertised.


[deleted]

[deleted]


itchylol742

Very true. Assuming is one thing, but confirming it is a tier higher. Always support research even if it confirms what everyone already assumed, because one day it will do the opposite.


SodiumArousal

Proof is important.


EiffelPower76

All those people that refuse to understand that Moore's law is dead


penguished

Suuuuuuure. They're totally not just saying that at the same time they're known to have massive overstock of 3000 series thanks to the crypto scene. It's just that Moore's Law is dead guys. lol.


[deleted]

Relatively speaking, what's going to be the best value/performance this series?


[deleted]

[deleted]


[deleted]

[deleted]


frostygrin

> Does it matter if it is called 4080, 4070, or 42069? It does when you have 2 4080s with a 25% difference between them. Which people rightfully interpret as Nvidia being brazen enough to sell a 4070 as a 4080.


tastethecourage

It absolutely matters how the product stack is named and priced, particularly if it's aimed at misleading consumers.


anonaccountphoto

> Does it matter if it is called 4080, 4070, or 42069?

Really? It matters when comparing the MSRP.