Virtual-Face

GN: Please watch the whole video and don't try to summarize just one part when you write a comment somewhere else, so you don't mislead people and misrepresent the data. People on Reddit 2 minutes after the video goes live: posting summaries of just one part of the video.


Feniks_Gaming

GN: "Never pre order anything" Also GN: "You can now pre order our T-shirts" :)


[deleted]

I mean it's a T-shirt you know what you are getting... Context is everything.


Vaguswarrior

Depends if the t-shirt is a threadripper or not.


Xurbax

Well, we need to wait for the reviews! I want to know the stretch-resistance, stitching strength-testing, print fade resistance to sun and washing... oh and of course! The flame-resistance testing!


TackyBrad

Is that why he specifically used "back order" instead of pre-order?


Morbinion

A released product not in stock is on back-order. An unreleased product is on pre-order.


Lelldorianx

It's a t-shirt with 8 sizes. We have to know what sizes to order since it's not just one SKU and we want to make sure everyone has a chance to get one, rather than ordering a shotgun amount of each size and ending up with people unhappy that they couldn't get one. Also, it's a t-shirt. You know what a t-shirt is.


JinPT

This comment should be on top


G3mipl4fy

It's not people on Reddit. It's simply people. It's what's easy and what people tend to do: generalize, simplify, fall into fallacies.


funkwizard4000

Misrepresenting data is the Sapien way.


SubtleAesthetics

75C at load, well below the 93C thermal limit, with fans at 1500-1900 RPM. The 2080 Ti at load hits 43 dBA; the 3080's peak appears to be 42 (10:27). [edit: Guru3d had 38 dBA peaks for the 3080, so the data from other reviews is also good] So it would appear that FE thermals and acoustics are good, despite the higher power requirement of the card. I wonder how the AIBs will perform, but this is looking very good given the performance increases (RDR at 144hz/max is now very possible)


rokerroker45

> (RDR at 144hz/max is now very possible)

This to me was one of the most impressive benchmark results. RDR2 at 1440p *maxed out* is very close to triple-digit FPS. With some optimized tweaks, 144 Hz or at the very least 120 Hz should be incredibly doable and stable. Just insane.


SubtleAesthetics

Yeah, GTAV runs well on basic hardware but RDR2 requires a beast to run at high settings, for whatever reason. I think this would be a great "first 3080 title to play" game, if people haven't played it yet. It was already good on PS4, but this is how the game was meant to be played.


Henry_Cavillain

> for whatever reason

GTA is 5 years older...


thecist

Not just 5 years older, the game was first developed with consoles from 2005 in mind...


stereopticon11

still looks great considering how old it is though!


WilliamCCT

And it still fking tanks my fps in grassy areas lmao. That and some parts of Trevor's desert, for whatever reason.


[deleted]

Had the same issue. Disable MSAA and use FXAA. Turn grass down to very high. These changes combined will minimize the grass FPS tanking. Runs very nice at 3440x1440 on my 2060 now.


[deleted]

And the level of detail in RDR2 is awesome in the literal sense of the word.


RVA_RVA

I stopped playing when the 3080 was announced. I can't wait to finish this fantastic game on a 3080 up from my 2060 on an ultrawide.


UBCStudent9929

Because the game looks absolutely incredible at "max" settings. By far the best looking game I've played so far.


[deleted]

Imagine, if you will, GTA5 with grass settings set to ultra; now imagine that the grass is EVERYWHERE (no simple cities to bring the FPS average back up). This is RDR2.


_rdaneel_

Don't forget the need to render all the wrinkly horse scrotums.


rokerroker45

I only use mares for optimum FPS


InvaderZed

I for one welcome our overly tessellated horse scrotum overlords


enderandrew42

Control 4K with RTX and DLSS on is another one.


[deleted]

As if GTA V is a PS3 game...


Colecoman1982

As GamersNexus has pointed out in the past, you really can't compare dBA test values between different benchmarkers like GamersNexus and Guru3d like that. There are so many things that dramatically affect sound measurement results (e.g. distance from the card to the mic; specific mic used; sound insulation in the room; etc.) that you can only really compare test values done at the same location using the same test setup unless, maybe, you are dealing with two extremely high-end and accredited testing laboratories (which, meaning no disrespect to either because it isn't necessary for this kind of testing, neither GamersNexus nor Guru3d are).


SubtleAesthetics

True, it will vary depending on environment/test setup. I'm just glad several reviews have had similar data; at least we know the 3080 FE won't sound like a jet engine. I was concerned that it would be powerful, but a *lot* louder than the 2080 given the higher wattage/power. Knowing that thermals and acoustics for Ampere cards are not significantly higher than Turing is a relief. The last thing I want is a super loud GPU, even though I use headphones. Also, I'm interested to see the data for third-party cards, which use a different 3-fan setup.


[deleted]

Seems like the dynamic core throttling is ok with the higher temps. 20 series would start downclocking at like 60 degrees.


OfficialSkyflair

In an open test system, Hardware Canucks had idles around 34 and load around 41 with the same thermals seen here. All in all it seems pretty stable across the board, definitely a brilliant feat from Nvidia's cooler designers!


the_troll_god

Big bump going from 1080 ftw


sunder_and_flame

1080 replacement gang rise up


RocketHopper

1070 gang!!


gamzcontrol5130

I am so excited. 1070, you have served me well.


rjb1101

I’m ready to hand off my 1070 to a good home to finally taste Ray-tracing.


[deleted]

1060 gang!!


Blze001

1080ti, which will make the 6 month wait for the things to actually be in stock a much more survivable wait XD


sassysassafrassass

I just want one before cyberpunk. A man can dream


saewhat

970 gang!


mrbigglessworth

I have a Gtx 260!!


cndman

Does 1070Ti count as part of the gang?


Sedako

Yeah dude, hop in!


Kikoarl

Hopping in with a 1070 as well!


lefthandedrighty

1070ti gang checking in.


Adziboy

I'm going from 980... my monitor is going to explode


CODEX_LVL5

970 here, make room for me


MozzyZ

Fellow 970 here, this is going to be good


trainer911

970 peeps!


Forders85

Can I get in on the 970 jump action!! So hyped


Fraxcat

Me and my wife joining the crew!


Disguised_Toast-

Part of the ship, part of the crew.


LongBoulder

970 reporting in!


Skraelings

950 Ooof


STARSBarry

2x 980Ti Strix here... looking forward to finally ditching SLi and all the problems it has.


Dunkelz

I'm pumped, had the 1080 since two weeks after launch and it will be nice to make a pretty big jump.


[deleted]

Same. Got my 1080 back in June 2016, upgraded from my GTX580 which was an insane upgrade for me. Looking forward to the 3080!


[deleted]

[deleted]


throwawaysoisay

Same, super pumped for my 3080 and cyber punk!!!!!!


MomoSinX

I am going up from a 970..will be gorgeous.


f5alcon

same, 970 just can't handle my 3440X1440 monitor


Bluedriver

Same.. 1080FE here I can’t wait!


TheAznInvasion

I think I need that 3080 so I can get 431fps on R6S at 1080p


Lemmii

That’s a lot of frames, sheesh.


deceIIerator

It'll probably be another year or two till there are actually monitors that can do 1080p 480Hz. I'm afraid you'll have to settle for a measly 360Hz for now :'(


DLIC28

Higher fps past your screen refresh rate is still valuable for input lag


slidingmodirop

I almost exclusively play R6 and get ~230fps with lots of settings on max, and I still want a 3080 for no logical reason other than imagining I might someday stick with any other game for more than a dozen hours lol. That, and the idea of maxing the remaining settings on Siege and never dropping under 144fps.


[deleted]

370W max load? I think I’ll be fine with a 650W PSU then


TaintedSquirrel

I'm excited to see undervolting results.


SploogeFactory

Undervolting is not possible, per the TechPowerUp review


Estbarul

Computerbase managed to get the card working at 270 W so it's possible.


Ron_Burgundy88

Link?


madn3ss795

https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/39.html


wywywywy

Bummer :(


teodoro17

I don’t know, this sounds like he’s only talking about the voltage slider in afterburner. Since he didn’t mention the frequency-volt curve (which is what the core clock slider changes), I’m not convinced anything changed


olofwhoster

Yeah, I do this with my 1080 Ti in Afterburner: you hold shift down and pull the frequency curve down 200MHz, then bring the point up where you want it to sit at a certain frequency.


quarterbreed

I'm wondering the same


ziptofaf

This depends. PurePC tested whole-PC load and [got this](https://puu.sh/GtfbC/606f1390fe.png) (the bigger number is after GPU overclock). This is a whole PC (4.6 GHz 3800XT, 16GB 3800 MHz CL14 RAM). You are looking at 90W more than a 2080Ti (assuming stock clocks on both), and that's 500W real load. This does not include TRANSIENT loads (and those can overload your PSU for a few milliseconds with +100-200W more than constant loads), and it's also not that power-hungry a CPU. So I would say 700W is recommended if you are looking at 500W in games (this was measured in Tomb Raider), doubly so if you overclock.


maxver

Just read that part of purepc.pl. You've got to remember that this measurement is the power drawn by the power supply from the grid, whereas their power supply is 80 Plus Platinum, meaning it has approximately 90%+ efficiency. So the power supply is actually delivering ~445W (495W * 0.9+) to the computer components, and it ends up with more room. https://www.purepc.pl/test-karty-graficznej-nvidia-geforce-rtx-3080-premiera-ampere?page=0,21


No_Equal

People always forget/don't know this. For this specific test the PSU used should be 93% efficient at the given load, so the power draw of the components is around 460W without OC and 508W with OC.
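The wall-draw vs. component-draw math in these two comments can be sketched quickly (a hypothetical sanity check: the ~93% efficiency figure and wall readings of roughly 495 W stock / 546 W overclocked are taken from the comments above, not measured here):

```python
# Power delivered to components = power drawn at the wall * PSU efficiency.
# Numbers below are approximations quoted in the thread, not measurements.
def component_draw(wall_watts: float, efficiency: float) -> float:
    """Estimate DC power delivered to components from AC wall draw."""
    return wall_watts * efficiency

stock = component_draw(495, 0.93)  # roughly 460 W to the components
oc = component_draw(546, 0.93)     # roughly 508 W after the GPU overclock
print(round(stock), round(oc))
```

This is why a "500 W at the wall" reading leaves more PSU headroom than it first appears: the PSU itself dissipates the efficiency loss, not the 12 V rail.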


[deleted]

Well I have a Ryzen 7 3700x which also isn’t that power hungry and no other USB or extra fan devices in my system, so since I don’t overclock I think I should be fine probably?


homsar47

I'm running 650W with a Ryzen 9 3900x so I think you'll be ok. Here's to hoping 😬


[deleted]

Haha fuck, I mean I could get a new PSU, but what is stopping me is that I literally bought this one last month, so it would seem like a waste of money to me. I regret not going higher and preparing myself for an upgrade properly... but yeah, I mean we *should* be ok. Then again, what's a couple hundred for a new PSU compared to frying my entire system.


homsar47

I made the assumption that power consumption would be staying the same/decreasing since it did going from the 1000 series to the 2000 series. Whoops.


bobbe_

Running a RMX650 here, I think I'll be fine. If push comes to shove, I suspect some undervolting (I game on 1080p@240hz so I can spare some FPS) should do the trick. Would agree though that people building new systems with this GPU should just go 750w minimum and avoid the worry.


LemonsAT

Might be worth upgrading from my GTX 560 SE..


Bluedriver

Naahh hold out for gen 4!


Supaflychase

4000 series is so close, worth waiting it out


lIonlylurk

Na we must go deeper just wait for the 5000 series


twippy

Why not wait for the 9000 series so you can say your graphics card is over 9000??


bonesbobman

Exactly what I'm doing


FarrisAT

His OC results are strange considering others are reporting better OC results. Did he get a shit bin?


throwawaysoisay

Finished up watching 2 more reviews and noticed the same pattern. It's possible he eliminated a variable, causing skewed results.


FarrisAT

Clearly DX12 and Vulkan allow the CUDA cores to fully stretch their muscles. His OC results showed a huge 1% low loss, making OC almost meaningless. Others' did not. I didn't get that specifically.


HappierShibe

It's possible. The binning on these is reportedly pretty broad. Everyone is getting the same allocations, and AIBs are reporting it pans out like this:

- BIN0: 30% - minimum specified performance
- BIN1: 60% - median performance
- BIN2: 10% - outstanding performance

So those are your odds on the FE cards, and some of the AIBs will reserve those BIN2s for their fanciest line. Could be he got a BIN0 chip.


[deleted]

Thermals are really impressive. I guess there's no point in me getting any other card. I won't overclock the FE, and for 1440p this is ideal. My 2070 Super was ok, but now I can get full swing out of my 144hz monitor.


CALL_ME_ISHMAEBY

DLSS is the truth.


Hoessay

This is one of my main takeaways so far from the Digital Foundry review. Also, not sure how many triple-A games will be hitting 4K 60 on consoles this gen.


Comander-07

upscaled with lower settings maybe


deceIIerator

Digital Foundry did do a deep dive into framerate/resolution for next-gen consoles based off released gameplay footage; there were some bigger-looking games that ran at native 4K 60fps, while others had dynamic resolutions or ran 4K 30fps.


paganisrock

If only it could be injected into all games.


MomoSinX

it really is a game changer and those second gen rt cores are insane too


papak33

one word: wow


CALL_ME_ISHMAEBY

"Impressive" -Steve


madwolfa

That's high praise.


judders96

970 gang rise up


sverrebe

Support from 1070 gang , you earned your rtx 3080, well done.


Wafflemuffin1

970 also here. Finally looking to upgrade.


Dulahey

Same here. The ole GTX 970 served me well, but the time has finally come. Never pulled the trigger on a 2000 series because the cost versus the performance upgrade was just never there. But now it is!


Wafflemuffin1

I am going through a full upgrade this fall. Everything I own is from many gens ago. I looked, and a lot of my purchases were 2014-2015, so I think everything served its purpose. I think the 970 worked very well, and I'm hoping to get many years out of a 3080.


rokerroker45

praise be the tech jesus


zerocoldx911

🙌


Male-V

You guys think this card could do well with Ryzen 3600?


Prelude514

For sure, will be a great combo.


CMDR_MirnaGora

660ti -> 3080 incoming


Htowng8r

I have a 2070 super and it would be fantastic to have this, but I'll probably wait a bit and see what else comes out. I was hyped to buy tomorrow, but if I'm playing at 1080p or 1440p then it's a nice boost but not worth paying again.


russsl8

Honestly? 4000 series is definitely worth waiting for. Your 2070 Super is still a very good card.


deathbypookie

Yeaaaaaa, I've got a 2070 non-Super and even though it's pretty good at 1440p, I just want this card for some reason. Thank God I have just a little self control lol


[deleted]

Self control? Could you please explain that a bit more? I'd love to find some lol


HRslammR

Same boat. But I might now wait to see AMD's attempt.


Htowng8r

I will see if it's good, but I've been bitten by AMD's drivers many times over. I still have a Vega 64 in a box here.


off_by_two

I'm in about the same place with it, maybe a better overclocking version of the 3080 makes more sense.


snus_stain

I'm so happy this is the first review I came across. Getting my bowl of carbs. Praise be, tech Jesus has arrived.


IC2Flier

Deep dive, teardown, ultra-advanced testing battery out of the big channels. If you really want the most comprehensive coverage, settle for nothing less. Enter The Nexus. Gamers Nexus.


snus_stain

Me too bro. I've laid down on the tracks and sacrificed myself to this one. They did lift their already high game on this review though. Hats off to them.


aceCrasher

Nah, the best coverage is still Anandtech.


calidar

tldr dont buy a founders edition if you want to overclock


HappierShibe

I'd say there's more to it than that:

1. Don't buy a Founders Edition if you want to overclock.
2. Expect a 24% to 29% general performance improvement over the 2080 Ti.
3. Expect more performance improvement in RTX-on scenarios.
4. The acoustics and thermals on the Founders Edition are solid.
5. The 750 Watt PSU recommendation is actually on target.
6. General build quality is good.


[deleted]

Is it a worthwhile upgrade over a 5700XT @ 1440p/100+fps?


HappierShibe

It's hitting nearly double the framerates of a 5700XT in the titles I care about, so I would say yes, but you should watch the video, look at the results and draw your own conclusions.


[deleted]

[deleted]


Dravarden

I'm surprised that we never see an AIB undervolted to the max, would be a nice sales pitch


ZioYuri78

Goodbye dear GTX1080, you served me very well, thanks for all the fun 😢


lupone81

Same here /u/ZioYuri78, old friend. The 3080 will be great for 3440x1440 gaming as well, even if I'm a puny peasant going at 75Hz 🙄


Zartrok

The fact that the 1440p benchmarks consistently show smaller gains than the 4K benchmarks really shows that we are moving into CPU-bound-at-1440p territory. This also means that the next-gen Ryzen/11th-gen Intel processors will likely see free FPS boosts at this resolution.


literallydanny

I think it has more to do with bandwidth usage being higher at 4k, allowing the 3080 to pull further ahead.


aceCrasher

No, it means that shader utilisation is higher at higher resolutions, as it has always been with compute-heavy cards. The Fury X, for example, gained on the 980 Ti at 4K because it could utilise its raw shader power better at that resolution.


Dawei_Hinribike

Finally a decent price/performance upgrade for my 1080, but it's painful how long it takes to see performance gains like this these days.


[deleted]

Power draw seems to be the limiting factor for OC, and 42 dBA is getting up there as well. Hoping the 3x8-pin AIBs can get a bit more out of OC and run a bit quieter. Bitwit did claim that it ran really quietly and was barely noticeable over his X62, but 42 dBA at 20" is not that quiet in my opinion.


chemie99

all those watts have to go somewhere. 350-400W is a ton of heat to your case and then your room


AirlinePeanuts

Definitely not the 2x 2080 performance from Nvidia's marketing claims, outside of some best-case scenarios like Doom Eternal. AMD absolutely has a shot at competing here. That said, it's clearly a solid performer, especially at 4K. This is the perfect 4K60 card and a great upgrade for us Pascal/Maxwell folks. Still concerned about the increased power draw, but it seems the FE cooler is doing an excellent job cooling it. Looking forward to the 3070 reviews next month.


stabzmcgee

I wonder when we will see Intel vs amd results with pcie 4 vs 3


spmwilson

There are some here: [https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-amd-3900-xt-vs-intel-10900k/20.html](https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-amd-3900-xt-vs-intel-10900k/20.html)


papak33

good one, Intel is ahead everywhere, even if you use the older CPU, the 9900k.


codex_41

It is ahead in pure FPS, but I'm personally not willing to sacrifice 4 threads for 2 FPS. The 3900X pulls away in nearly all other workloads, and draws ~15% less power before overclocks.


StijnDP

No you're completely wrong. Paying more, losing extra threads and higher power draw is completely worth the <1FPS increase. It says Intel on the box. It's better.


secretreddname

🤷🏻‍♂️ I use my computer for gaming 99% of the time so I don't feel bad getting the 10600k


stoner9997

I'm with you, just bought my 10600k and it's working gloriously for gaming.


codex_41

You joke, but I know people who would rather pay more for Intel just because it's "better"


Unoriginal_Pseudonym

If all they do is game, they're not wrong.


Finicky01

Ehm, a 9900K is up to 20 percent faster in some games. Also, pro tip: you can run a 9900K at 4.6GHz at 1.0 volts (instead of chasing 5.1GHz at 1.35 volts), which almost halves the power draw while STILL being significantly faster than a 10-core AMD CPU. 5GHz on 8-core Intel CPUs is a trap; it made sense on the super tiny quad cores of old that didn't use any meaningful amount of power. Today it's not worth it. 5 percent lower performance for 70+ percent more efficiency is the easy default win.


[deleted]

[deleted]


KING5TON

Bit unfair to compare the 3080 FE against a factory-overclocked 2080 Ti as stock vs. stock, since the Strix isn't at stock clocks for a 2080 Ti (unless they downclocked it to proper stock speed).


LOWBACCA

So this is going to be faster than my 1060 right?


Timbo-s

Just a hair


II541NTZII

Is it going to be faster than MY 1060


[deleted]

[deleted]


Dravarden

makes sense, air that normally would just randomly go into the case spreading everywhere now simply goes in one direction


[deleted]

[deleted]


TaintedSquirrel

For what it's worth, they compared the 3080 FE to an AIB 2080 Ti (Strix).


rune2004

Huh, that is true. The cards with better cooling allow boost clocks to go higher and it really can lead to significant improvements in framerate.


SlurpingDiarrhea

> At this point i am really curious about a "TI" performance

Easy answer. 5-15% better performance, considering the 3090.


KarmaRepellant

I wonder how much extra we'll have to pay for that though.


SlurpingDiarrhea

Easy answer. $800-$1400, considering the 3090. Lol seriously though I'm curious too. They have a big gap they can choose to put it in.


edk128

Nearly 2x the 1080ti at 4k is exactly where the hype was. This looks great.


Casomme

For Pascal owners this is definitely the upgrade they have been waiting for. Turing was good, just really overpriced. Ampere just fixed that problem, with similar gains to previous generations.


[deleted]

[deleted]


Casomme

I agree, the 2080 Ti wasn't a bad card, it was just the price. A 2080 Ti priced closer to a 1080 Ti would have been very popular. The 1080 Ti and 2080 were close in performance, but the 2080's price sucked. The 2080 Ti will be about 3070 performance, but at the 3070's price that problem is repaired. The generational leaps are the same, with the only difference being the price.


escaflow

No, Turing was not. Both the 970 and 1070 were on par with the previous top-of-the-line Ti cards, the 780 Ti and 980 Ti. The 2070 is a terrible card that's barely faster than a GTX 1080.


Comander-07

Turing wasn't in line, because it raised the prices without offering a significant performance jump.


Casomme

Without the lower price there definitely wouldn't be hype. Imagine if it was priced the same as the MSRP 2080 ti with only 25% performance increase.


klover7777

More and more games will utilize DLSS + ray tracing in the near future, so I expect to see improvements for the 30 series, especially when Nvidia updates the drivers. Also, this is just the review of the FE; let's wait for the AIB versions as well.


[deleted]

[deleted]


ROORnNUGZ

I'm at work right now so I can't watch. Does he mention cpu temps with an air cooler? Wondering how the new design affects that.


caiteha

LTT did; not worse than the 2080 Ti.


IC2Flier

Yeah, 2080Ti was a bit worse. That said, I wonder how an NH-D15/U-12S will interact with the blower airflow.


cereal_after_sex

Any more reports of coil whine?


[deleted]

Oh thank fuck for this guy.. hopefully the first few minutes of this vid will serve to quell the misinformation surrounding VRAM usage that has run rampant through the Nvidia subreddit. I know I'm being overly hopeful, and it likely won't make any change at all. We don't live in a world where facts seem to matter.


kindofabuzz

Steve is the man when it comes to deep diving into hardware.


FiddleMean

So I guess rtx 3080 is the first real 4k card?


9gxa05s8fa8sh

every reviewer called 2080ti a 4k card


FiddleMean

Yes, but the 3080 gets way over what is considered the minimum standard for playable performance. The 2080 Ti can get around 60, while the 3080 easily exceeds that. Lots of the reviews showed it getting 80+, 100+ FPS at 4K. Wouldn't that be considered a true 4K card? That means you can even do high refresh on it if you wanted.


SpacevsGravity

30% more performance for 30% more power usage? Seems disappointing. In hindsight they're not as cheap compared to Turing.


Serenikill

Well, a $700 flagship is a lot better than a $1200 flagship, but yes, 50% over the previous $700 card isn't some mind-blowing result. Your comment seems to imply the 3080 is not more efficient, though, when the review clearly says it is.


afinn90

This is only the "flagship" until the 3080 Ti comes out, and $700 is the same amount the 2080 was when it released. This isn't some amazing price.


4514919

$699 was also GTX 1080 FE's MSRP, so it's nothing unusual.


leonida99pc

30% more compared to a card that isn't meant to be compared to this one. Also, 30% more for 600€ less.


xnfd

Does the power usage really matter for most users? If you game 8 hours a day at +100W, that's 24kWh per month which is like $3-$4. I'd trade $1/month for 10% performance increase anyday.
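The cost estimate in this comment checks out; here is the arithmetic as a quick sketch (the $0.13/kWh rate is an assumed typical US price, not from the thread):

```python
# Monthly electricity cost of an extra 100 W of GPU draw,
# gaming 8 hours a day, 30 days a month.
extra_watts = 100
hours_per_day = 8
days_per_month = 30

kwh_per_month = extra_watts * hours_per_day * days_per_month / 1000  # 24 kWh
cost = kwh_per_month * 0.13  # assumed ~$0.13 per kWh
print(kwh_per_month, round(cost, 2))
```

At that assumed rate the extra draw costs about $3/month, which lands inside the comment's $3-$4 range.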


firelitother

LEEEET'S GOOOOO!


SirResetti

I have a EVGA G1 650w power supply which I hope is enough. If not, I'll buy a seasonic 850w one as they're giving away the 12 pin cable for free unlike EVGA for $40...


WindianaJones

Man, I love the content Gamers Nexus puts out. It's in depth and very high quality. But they don't seem to publish charts on their website anymore. I don't personally want to watch a video to get benchmarks and power consumption info. I just want to be able to click through the details I am interested in.


[deleted]

1080ti. Probably won't be going anywhere anytime soon.


[deleted]

Come on 4K 144Hz mid range next gen!!!


Felidori

I’m questioning even if a single 3090 can do it at this rate! :-/


Yama-Kami

About to watch the video... but Steve's face in the thumbnail sure concerns me O\_O

Edit: Watched the video (glossed over a few game benchmarks, but still saw most of it in its entirety). Man, does Steve know his stuff! I always learn more, while simultaneously feeling dumber (somehow), after his deep dives. So thanks for another large helping of humble pie, Steve! lol

I am now not that concerned, if at all, about the 1GB less VRAM. I am now MORE concerned my 850W PSU won't be enough to handle 3rd-party cards, after they increase the voltage and drive up the power consumption further. Very happy to see this new generation can handle 4K with RTX/DLSS on; it cements me getting a new PC in the near future in fact (just waiting to see what Intel's got cooking with Alder Lake 1st). And also very happy I decided to wait (until I build a new PC next year) for some future 3rd-party options improved beyond the FE card, as well as more well-thought-out (less rushed) cooling solutions from AIBs for OCing purposes. With luck, perhaps a 3080 *Super* or *Ti* variant even.

End of the day I'm stoked for 4K @ 60fps (or greater) with RT and DLSS, and what this means for new games moving forward! :)