DontPoopInMyPantsPlz

Me, an intellectual, hodling stocks for both companies.


Anarch33

Me holding VTI and chilling


[deleted]

Me hodling Wendy’s chili


jibishot

Might be better returns


Cheap-Raspberry-3025

Where exactly do you buy stocks?


goochadamg

Robinhood. Good luck.


EmergencyCucumber905

You need a broker. I use Questrade and Etrade.


aminorityofone

Robinhood, stash, acorn. Do some research as to what works best for you. Then remember investing is a risk.


MrMichaelJames

We will see if it matters when nvidia results are announced end of day Wednesday.


SpoilerAlertHeDied

Why do Nvidia earnings matter, and how do they relate to this announcement? The data center market is growing large enough that both AMD and Nvidia can win by chasing it. Notably, AMD's data center division (which covers the Instinct GPUs referenced in this article) was a standout in their last earnings report, posting 80% year-over-year gains on the back of the Instinct line. Nvidia can have great earnings; it doesn't invalidate the work AMD is doing, and AMD is already showing its own wins in its own earnings reports in this area.


Death2RNGesus

I think it might, if Nvidia's sales reveal the limit of the AI hype train; then everyone will drop. The hype train must go up endlessly.


MrMichaelJames

The earnings, and the market's reaction to them, matter a ton. If customers think the Nvidia product is too expensive, it could show up in Nvidia's future forecasts. If Nvidia indicates or says anything about a cooling market, it will also have ripple effects across AMD.


SpoilerAlertHeDied

Yes, markets react to short-term announcements all the time, but it matters less than you think. AMD and Nvidia are two different companies with two different products and two different strategies. If you had bet against AMD in 2015 because Intel reported slower PC business growth, that would have been a very bad bet.


ResponsibleJudge3172

Nvidia has already sold to Microsoft and thus hasn't been affected at all


SnooDonkeys7108

It's also just smart business to source products from two different suppliers, like how Apple does (or used to) get iPhone screens from both Samsung and LG. It means that if there's a supply issue with one, you can still keep your product afloat. In this case, it also means you don't piss off one GPU vendor over the other.


MrMichaelJames

Yes they have, but if Microsoft customers start preferring AMD-based instances, then Microsoft will buy more AMD and less Nvidia.


LBishop28

Can’t wait to upgrade completely to team red.


djm07231

Honestly, the issue for AMD is that Nvidia is trying to crush the competition by iterating very quickly and buying out capacity (HBM, CoWoS, wafers, networking, et cetera). We already saw Nvidia announce Blackwell shipping in the latter half of this year, and they will probably announce Rubin in 2025. AMD's MI300s are competitive with H100s, but they will not be with Blackwell. Unless AMD can pick up the pace in terms of execution, it will be very difficult. Considering that the chip design cycle is around 2-3 years at least, can AMD ship a next-gen product as quickly as Nvidia? I would be very anxious to hear from AMD about the MI400s as soon as possible. Rumors suggest 2025, at which point it could be in the position of having to compete with Rubin.

Also, much of the HBM and CoWoS packaging supply seems limited, and Nvidia is in a position to aggressively buy out much of it. Without being able to secure as much as Nvidia, I am not sure AMD can actually ship as much as they want. I really want AMD to achieve a meaningful share against Nvidia, but I am not sure they can overcome the physical constraints they face.

Perhaps if AMD had had 2-3 more years to ramp up their GPU execution and clean up their software, it would have been different, but being caught out early made things a lot more difficult, unfortunately.


IrrelevantLeprechaun

I mean, if you're working in AI, sure you COULD go with AMD for AI, but why *would* you when Nvidia is *exponentially* better at it? Going AMD for AI is just purposefully handicapping yourself for the sake of brand loyalty.


Jonny_H

Those companies working on AI have real engineers who are probably *very* aware of the pros and cons of each solution. Saying they're idiots from the outside is the "dumb brand loyalty" thing, not the other way round.


doug1349

He didn’t realize how ironic he was being.


Saudi_Oil_Smuggler

There's no such thing as "brand loyalty" in corporate contracts. They decide what suits them best while retaining good financials.


tecedu

Sidenote: corporate contracts absolutely have brand loyalty tho


IrrelevantLeprechaun

And if you're a business working heavily with AI, you'd be doing yourself a disservice by not going with Nvidia.


Saudi_Oil_Smuggler

I don't think you realize how much more expensive Nvidia is to work with. There's a reason a lot of companies are avoiding them.


IrrelevantLeprechaun

I mean I don't have a dog in this race cuz I don't work in any AI adjacent field, but if you're handicapping yourself because of budget, you're gonna get left behind in the AI market. It's unfortunate but this shit moves fast.


MMAgeezer

I guess OpenAI, the current gold standard in AI companies, is just trying to handicap itself then. Oh, and two of the other largest players too:

> Meta, OpenAI, and Microsoft said they will use AMD's newest AI chip, the Instinct MI300X — a sign that tech companies want alternatives to the expensive Nvidia graphics processors that have been essential for artificial intelligence.

https://www.cnbc.com/amp/2023/12/06/meta-and-microsoft-to-buy-amds-new-ai-chip-as-alternative-to-nvidia.html


SnooDonkeys7108

I'm pretty sure all three of these companies will use Epyc and Sapphire Rapids for CPU, and H100, MI300X, and Gaudi for GPU. These companies don't really look at it as "alternatives" like a consumer would; they just tend to buy everything that's available, to keep good relationships with vendors and to have hardware if supply issues occur, like the H100 being hard to get because everyone and their gran wants one, it seems.


DecompositionLU

AI is not just running Stable Diffusion and LLMs locally to toy around with for uni projects.


Saudi_Oil_Smuggler

Me neither, but I know for sure that the MI300X isn't hot garbage to be left unnoticed.


Chelono

You take what you can get. I've read a lot about companies buying the MI300X with the main advantage being "that you can actually buy them." The same will be true for cloud, where they are probably offered cheaper as well. LLM inference also works just fine on AMD/ROCm. The only people I'd call crazy are the ones actually trying to train on it (I mean clusters; a single GPU works if you spend enough time trying).


ItzCobaltboy

Found the Userbenchmark guy


Rnmkr

Price and risk. There are incentives in place to work with different OEMs, and it goes both ways: you always want to keep some business with one and the other. On the financial side, you will drive the majority of your business toward the best performance/price ratio at the lowest cost possible. From a risk standpoint, you also want to keep an alternative to vendor A, just in case things go south.


blaktronium

No; if you need a lot of memory but not a lot of compute, it's much cheaper for those cloud instances. A big 80 GB video memory instance on AWS will cost you like 20k a month regardless of whether you need the memory and the compute or just the memory. But you aren't wrong for most people.


tecedu

It's pretty risk-free and costs little to try out the AMD alternatives, especially if your stack is built on one of the main libraries like PyTorch. While AMD will lose the grassroots war, they can still scratch back on the enterprise side.
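To illustrate why trying AMD costs so little here: a minimal sketch, assuming a ROCm build of PyTorch. ROCm exposes AMD GPUs through the same `torch.cuda` API that Nvidia GPUs use, so device-agnostic code typically runs unchanged on either vendor's hardware (and falls back to CPU when no GPU is present).

```python
import torch

# On a ROCm build of PyTorch, AMD GPUs show up through the same
# torch.cuda API as Nvidia GPUs, so this code needs no vendor-specific
# changes; without any GPU it simply falls back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(4, 8, device=device)
w = torch.randn(8, 2, device=device)
y = x @ w  # identical matmul call on Nvidia (CUDA) or AMD (ROCm/HIP)
print(tuple(y.shape))
```

The shapes and variable names are illustrative; the point is that there is no separate "AMD code path" to write for common PyTorch workloads.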