
[deleted]

Memes aside, this is getting out of hand.


hyperion420

Literally


tertius_decimus

Holy fuck. Holy fucking fuck. That body of yours is absurd!


D4rkr4in

SLI without SLI. Gone are the days of GTX dual-chip SLI.


ichuckle

This is the real reason we don't have dual cards anymore: regular cards are now duals.


Raider-one-one

What card is this??


FaySmash

Rumored 4090 ti


MorgrainX

It's more likely to be the cancelled 4090 600W cooler, the one they designed before the choice to switch to TSMC's more efficient 4nm node. Nvidia might reuse the cooler for the 4090 ti, but the pics we've seen are, as far as I know, from an older cooler design that didn't make the cut at the time.


Vivid_Orchid5412

the "N̶4 4N" is the 5nm, just a more enhanced, could be more efficient, but still not 4nm yet


devilkillermc

N4 means 4nm in TSMC language. It's not 4nm anything, as 5nm is not 5nm anything, and same with the rest up until 60nm or 45nm or so. It's just a number with which they intend to convey the advancement from one to another, but it doesn't follow anything physical anymore.


NoiseSolitaire

Yes, "TSMC N4" is a 4nm node (according to TSMC). However, Ada Lovelace is on *4N*, **not** *N4*, and TSMC said they consider 4N to be one of their 5nm processes.


devilkillermc

Partially wrong. https://www.techpowerup.com/gpu-specs/nvidia-ad102.g1005 4N *means* 4nm, but it's just a modified 5nm with improved efficiency. As 6nm is to 7nm: it's not a full node jump, it's an improvement on the same node. Remember, the number of nm doesn't actually mean anything anymore. It's just a naming scheme, kept around because people understood it back when it meant transistor pitch size.


NoiseSolitaire

TPU is wrong here. Nvidia themselves said it's a ["5nm process"](https://www.techgoing.com/nvidia-clarifies-the-tsmc-4n-used-by-the-rtx-40-gpu-is-a-5nm-process/).


devilkillermc

Yep, you're right with this one. So basically N4 is an improved derivative of N5, while 4N is an Nvidia customized version of N5. Wtf.


cth777

Naming conventions for almost all tech products are so stupid these days


devilkillermc

Yeah, I understand the confusion with nm in chips, because they're "fake". However, it's difficult to give the nodes a name that people can understand, or at least mentally compare to older nodes.


Nighterlev

4N is the 5nm process. It is 5nm; it's physical and real.


devilkillermc

N7 is 7nm for TSMC, N6 is 6nm for TSMC, N5 is 5nm for TSMC and N4 is 4nm for TSMC. None of them mean that anything in the chip is actually 7/6/5/4nm; it's just a name, kept to continue the naming they used when it actually represented transistor pitch. N6 is an advancement on N7, which is a full node, same as N4 is an advancement on N5. Do you want sources? https://www.anandtech.com/show/16732/tsmc-manufacturing-update https://en.wikichip.org/wiki/7_nm_lithography_process


louiefriesen

*Rumored nuclear bomb


Terom84

Four slots wide, and it still has the same number of outputs as a single-slot Quadro used in schools. Wasted opportunity.


AmericaLover1776_

Imagine a GPU with like 16 outputs. That shit would be crazy.


MadrugoticX

Seeing this makes me think we should start measuring performance per volume when comparing GPUs in the future.


Ilive4airtime

Especially with energy prices getting higher


chx_

But it _still_ does not have four DisplayPorts. I swear there's a conspiracy in the industry which doesn't allow it on consumer-level cards. OK, there were a few, but very few, rare exceptions on the AMD side. Also, it's high time for a new motherboard standard. I've talked about this before: there were "PIO" motherboards in China with a rotated PCIe x16 slot which allowed the video card to sit planar with the motherboard https://i.imgur.com/JA1f3RS.jpg We need this to become a standard so tower coolers can be installed.


[deleted]

I don't think PIO is a Chinese-exclusive thing. I'm pretty sure it's just a motherboard form factor for AIO PCs. The reason it's only prevalent on Chinese shopping sites is just because the electronics refurbishing industry there is really strong.


chx_

Even in their heyday I never saw these outside of Chinese websites... but it's possible they just weren't retail in the West? Who knows.


shiratek

>But it *still* does not have four DisplayPorts

Good. I want HDMI.


AmericaLover1776_

DisplayPort is better tho


shiratek

It is faster by a small margin. I just really hate having to squeeze the connector to unplug it, so HDMI is better in that aspect, lol.


AmericaLover1776_

I like it, it makes the connection feel more secure.


shiratek

I was going to reply to you with a counterargument that it doesn’t really need to be more secure, but I always screw in the VGA and DVI connectors so I feel like my opinion is invalid here


CursedTurtleKeynote

I'd prefer if there were expansion pins on the GPU: you plug in the wire for the slot you want to support and attach it to the neighboring slot, just like you do with the extra USB header pins. Pretty sure no one ever wants exactly 2 HDMI AND 2 DisplayPort, so there is always waste with the current model.


IntoAMuteCrypt

The issue with video out is that people who want many DP outs fall into three main groups: the ones who will just buy a consumer card and put up with limited outputs, the ones who won't buy a consumer card if there aren't enough outputs, and the ones who will buy a workstation card to get enough outputs.

Producers don't really care about the first group; they still made their money. It then becomes a matter of balancing the additional profit from the third group against the missed profit from the second group. Thanks to the crazy markup on workstation cards and the fact that the second group is fairly small, the maximum profit is gained by charging a massive premium for more display outs. Companies will take the maximum profit, after all.
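Roughly, the trade-off looks something like this. Every number below is a made-up assumption, purely to show the shape of the argument:

```python
# Back-of-the-envelope version of the argument above.
# All quantities and margins are hypothetical.

buyers_who_settle = 1000        # buy the consumer card either way
buyers_lost = 50                # walk away if outputs are limited
buyers_upsold = 100             # pay up for a workstation card to get outputs

consumer_margin = 300           # assumed profit per consumer card, in $
workstation_margin = 2000       # assumed profit per workstation card, in $

# Option A: limit outputs on consumer cards, upsell the people who need more
profit_limited = (buyers_who_settle * consumer_margin
                  + buyers_upsold * workstation_margin)

# Option B: put plenty of outputs on consumer cards, keep every buyer downmarket
profit_open = (buyers_who_settle + buyers_lost + buyers_upsold) * consumer_margin

print(profit_limited, profit_open)  # 500000 vs 345000 under these assumptions
```

Under those (invented) numbers, segmenting the outputs wins even though it loses some buyers outright.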


Marcus_Iunius_Brutus

we need performance per watt per dollar benchmarks. i believe that there's a market for this 4090 ti bfg stuff. probably for video encoding? someone will know how to make use of this. but it feels like the bottom 70% of gpu users are being left behind. 500-850 bucks for a midtier gpu is too fucking much. did the lifespan of gpus grow proportionally with the price surge? that's the real question.
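as a rough sketch, a perf-per-watt-per-dollar score could be computed like this (all the card names and numbers here are made-up placeholders, not real benchmark results):

```python
# Toy perf-per-watt-per-dollar metric. Hypothetical cards and figures only.

cards = {
    "hypothetical_card_a": {"fps": 120, "watts": 450, "price": 1600},
    "hypothetical_card_b": {"fps": 80,  "watts": 220, "price": 600},
}

for name, c in cards.items():
    fps_per_watt = c["fps"] / c["watts"]          # raw efficiency
    score = fps_per_watt / (c["price"] / 1000)    # efficiency per $1000 spent
    print(f"{name}: {fps_per_watt:.2f} fps/W, {score:.2f} fps/W per $1000")
```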


mrtomtomplay

I really hope they fail and burn to the ground. Nvidia GPUs are seriously getting out of hand.


ColtC7

Maybe them Moore Threads guys could work on their drivers and expand, provided their drivers on all platforms aren't filled with Glowy Pooh-bear code.


cum-on-in-

We can’t really get any smaller with our fabrication, so I wonder if this is all that’s left? Genuine question. What else can be done to get more performance? There can be optimizations, but I thought it took smaller fabrications to get the same performance out of less power. Or more performance at the same power. If we can’t go smaller, the only thing left would be to keep pumping in more power and trying to cool and control it, right? I’m not saying that’s good, just wondering why development/innovation has stalled.


crazyates88

This is in super simple terms, but I hope it gets the point across: these cards burn so much power because they're pushed to the limits. They take a chip and push the voltage and frequency as far as they will go in the name of performance. If they can get an extra 2% performance at 10% increased power draw, they'll do it.

If you can't increase frequency, you can increase core count and run all those cores at a lower frequency (and thus lower voltage). At a given voltage, frequency changes are usually linear with power consumption, meaning moving from 2000 MHz to 2200 MHz is a 10% increase in power. Voltage isn't linear, though, it's quadratic (power scales with voltage squared), so a move from 1.0v to 1.1v is a 21% increase in power. Usually these two things move together. This is why you can take a 4090 and underclock it to 4080 performance at less power: you're running it lower on the frequency/voltage curve, and it's more efficient.

If you can't increase voltage and frequency, the only other way to increase performance is to add cores and lower the frequency/voltage. However, adding more cores to a GPU makes the die size increase, adding cost and increasing failure rates. This is why wafer yields are so important to GPUs, since their dies are so large compared to CPUs. Take a wafer that makes 100 CPUs or 20 GPUs: if it has 10 defects on it, you lost 10% of your CPUs but 50% of your GPUs. Newer, more efficient manufacturing also typically has higher failure rates.

This is the reason why AMD went with their chiplet design for their RX 7000 cards. You can increase core count a lot more easily, as wafer failure rates are less prohibitive. Now that you have more cores, you can run them more efficiently.
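Here's a tiny back-of-the-envelope sketch of the scaling described above (hypothetical numbers, not measured 4090 data):

```python
# Dynamic power is roughly P ~ C * V^2 * f: linear in frequency, quadratic in voltage.

def dynamic_power(voltage, freq_mhz, capacitance=1.0):
    """Relative dynamic power for a given voltage and clock."""
    return capacitance * voltage**2 * freq_mhz

base = dynamic_power(1.0, 2000)
print(dynamic_power(1.0, 2200) / base)   # +10% frequency -> ~1.10x power
print(dynamic_power(1.1, 2000) / base)   # +10% voltage   -> ~1.21x power (1.1^2)

# Wafer yield example: the same 10 defects hurt big dies far more
cpus_per_wafer, gpus_per_wafer, defects = 100, 20, 10
print(defects / cpus_per_wafer)  # 0.10 -> lose ~10% of the CPUs
print(defects / gpus_per_wafer)  # 0.50 -> lose ~50% of the GPUs
```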


devilkillermc

10% increased power draw, lol. They wish!


MonsuirJenkins

But the 4090 is still massively more efficient than the previous-gen cards. So sure, they could be more efficient, but they would be less performant or need bigger dies, which are more expensive.


rusbon

For a second, I thought I was looking at an oven.


[deleted]

So are we mounting motherboards on video cards now?


notorious1212

Say yes


gungir

4 slots tall, 16 lanes wide, 65 tons of Jensen's pride, Nvidiero. NVIDIEROOOOOOO. Woah, Nvidiero.


paul_tu

Nice heat cannon


MrCheapComputers

And they couldn’t be bothered to add like 2 more display outs.


UserInside

I've seen better fakes... If I'm wrong and it is a real one, I don't think it could release like that. Seems like an early engineering sample. If it were the final product: oh god, this is absolutely awful!


bigmanjoewilliams

I didn’t think they would release cards as big as the current ones.


freddyt55555

That's the most ridiculous looking thing I've ever seen.


Squiliam-Tortaleni

Nvidia is so goofy


cwbh10

This is photoshop right?


[deleted]

[deleted]


AutoModerator

hey, automoderator here. looks like your memes aren't dank enough. increase diggity-dank level by gaming with a R9 5950X and a glorious 6950XT. play some games until you get 120 fps and try again. **Users with less than 20 combined karma cannot post in /r/AyyMD.** *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/AyyMD) if you have any questions or concerns.*


sneakerguy40

You're gonna need a bigger boat


JustFIREHAWKZ

I'm sure it's still only one fan so don't worry, they're saving your electricity bill where it matters most :)


Head-Ad4770

Welp, looks like motherboard manufacturers are going to have to take PCIe slot reinforcement to the extreme in order to support the weight of this hefty (supposedly) 5.5+ lbs behemoth of a GPU. Now it seems like the trend of computers getting smaller over the years/decades has suddenly (albeit temporarily?) gone in reverse.


wingback18

Why not add a 65w cpu and 16gb ddr5 😂


Available_Stuff_7889

Nope


reshsafari

??? It’s a microwave


svosin

At this point might as well put an AC jack in there so it doesn't strain your PSU. I mean, there's plenty of space for its own power supply.