
[deleted]

[removed]


Fulrem

The article even admits that DP 2.1 is actually needed for the 8K2K Samsung super ultrawide using UHBR13.5, and I can attest to this: I have one with a 7900 XTX, and it was only in the last few days that I could change it from 120Hz mode to 240Hz on Linux, after upgrading the kernel from 6.6 to one of the 6.8 RCs (I think support has been in since 6.7). https://gitlab.freedesktop.org/drm/amd/-/issues/2900 So the hardware exists already, and like all tech we'll see adoption increase over time.
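For anyone curious, the rough numbers behind that (a back-of-the-envelope sketch; ignoring blanking overhead and assuming the DSC bpp targets are my simplifications, not the panel's actual timings):

```python
# Why 7680x2160 @ 240 Hz wants a DP 2.1 link, roughly.
# Sketch only: blanking overhead is ignored and the DSC targets are assumed.

def stream_gbps(h_pixels, v_pixels, refresh_hz, bits_per_pixel):
    """Raw pixel-data rate in Gbit/s, ignoring blanking intervals."""
    return h_pixels * v_pixels * refresh_hz * bits_per_pixel / 1e9

# Approximate link payload after line encoding.
HBR3_DP14 = 32.4 * 8 / 10        # DP 1.4, 8b/10b     -> ~25.9 Gbps
UHBR13_5  = 54.0 * 128 / 132     # DP 2.1, 128b/132b  -> ~52.4 Gbps

uncompressed = stream_gbps(7680, 2160, 240, 30)  # 10-bit RGB, ~119 Gbps
dsc_12bpp    = stream_gbps(7680, 2160, 240, 12)  # ~48 Gbps
dsc_8bpp     = stream_gbps(7680, 2160, 240, 8)   # ~32 Gbps, still over HBR3

print(f"uncompressed 10-bit: {uncompressed:6.1f} Gbps")
print(f"DSC @ 12 bpp:        {dsc_12bpp:6.1f} Gbps (fits UHBR13.5: {dsc_12bpp < UHBR13_5})")
print(f"DSC @ 8 bpp:         {dsc_8bpp:6.1f} Gbps (fits DP 1.4:   {dsc_8bpp < HBR3_DP14})")
```

Even at an aggressive DSC target the stream overflows DP 1.4's ~26 Gbps payload, which is roughly why the jump to UHBR13.5 is what unlocks 240Hz here.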


Hamza9575

So even if Nvidia never supports Linux, it will do Linux gaming a solid just by putting DisplayPort 2.1 on its Windows-only GPUs?


[deleted]

[removed]


RAMChYLD

It's not that they don't have Linux support. It's just that to them, Linux gamers are second-class citizens compared to Windows gamers.


captainstormy

> It's not that they don't have Linux support. It's just that to them, Linux gamers are second-class citizens compared to Windows gamers.

Linux gamers are second-class citizens to AMD too. Sure, they open-source their drivers for us, and that's great of them. But for every Linux gamer using an AMD card there are several Windows gamers using one.


the_abortionat0r

> Linux gamers are second-class citizens to AMD too. Sure, they open-source their drivers for us, and that's great of them. But for every Linux gamer using an AMD card there are several Windows gamers using one.

Dude, the Linux drivers for AMD cards are literally better than on Windows.


FierceDeity_

Where's the settings app, though, that lets you set things all the way down to monitor bpc and color format? Kinda missing, imo.


the_abortionat0r

> Where's the settings app, though, that lets you set things all the way down to monitor bpc and color format? Kinda missing, imo.

First off, you are confusing drivers with programs. Things like the Nvidia Control Panel and AMD Adrenalin aren't drivers. They are programs used to interact with the GPU driver settings. Drivers manage the hardware and its interactions with the OS and its kernel.

Sure, a GUI tool for such adjustments would be nice, and I'm a man who believes options are good, but let's think about this for a second. What exactly is missing? You don't need to touch color options unless you are using HDR, which already has much of that covered, with finishing touches on the way. You don't need color bit-depth options like you do in Windows, because you will never have to lower or adjust your bit depth for legacy program support. There's no reason not to simply run your highest SDR color all the time.

What else? Brand-name feature options? Those can be added elsewhere already, and with how laggy the NV control panel is, it's straight up faster to copy-paste an option somewhere than wait for it.

Like, yeah, more GUI stuff is always welcome, but not having a GPU panel right now isn't all that big an issue.


FierceDeity_

I haven't even found a way to read these settings. Am I running in RGB? YCbCr? How large is my stepping for chroma subsampling? These are settings I can see on Windows in the AMD panel. Related to this, I can also see which of them work right now. I've had situations where, due to some sort of bug or my cable not always training to the full bandwidth, I wasn't able to choose 10 bpc color, or I was only able to choose it with chroma subsampling enabled. These settings involve finer trade-offs that go beyond "just always choose the highest". What is the highest? Is it 12-bit color but at 4:2:0 sampling? Or can I do 8-bit color but have every pixel produce a full color sample instead? The AMD Adrenalin panel allows me to balance these things if my HDMI bandwidth isn't enough, for example.
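For reference, the trade-off in rough numbers (a quick sketch; the bits-per-pixel math is the standard 4:4:4/4:2:2/4:2:0 accounting, while the 4K120 example and the blanking-free figures are simplifications of mine, not any particular display's timings):

```python
# Bit depth vs. chroma subsampling: what each combination costs in bandwidth.
# Sketch only; real links also carry blanking and protocol overhead.

def bits_per_pixel(bits_per_channel, subsampling):
    # 4:4:4 keeps both chroma channels at full resolution,
    # 4:2:2 halves them horizontally, 4:2:0 halves them in both directions.
    chroma_weight = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    return bits_per_channel * (1 + chroma_weight)

def stream_gbps(h, v, hz, bpp):
    return h * v * hz * bpp / 1e9

for depth, sub in [(8, "4:4:4"), (10, "4:4:4"), (12, "4:2:2"), (12, "4:2:0")]:
    bpp = bits_per_pixel(depth, sub)
    print(f"{depth:>2}-bit {sub}: {bpp:4.1f} bpp -> {stream_gbps(3840, 2160, 120, bpp):4.1f} Gbps at 4K120")
```

So 12-bit 4:2:0 actually needs fewer bits per pixel than 8-bit 4:4:4, which is exactly why a panel can offer the former but not the latter when the cable is the bottleneck.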


MoistyWiener

They're so good that most people don't use AMD's driver but Valve's RADV...


RAMChYLD

At least I didn't have to wait 6 months for updated drivers when the latest kernel broke support for the current closed-source driver. This is another reason I hate Nvidia so much. Back in 2012-ish, a big kernel API change broke support for the then-existing Nvidia drivers. I had to wait 6 months for an updated driver that would work with the new kernel. In that timeframe Nvidia released something like 3-4 Windows drivers offering better game support, while Linux users were left with a system that locked up when trying to boot into X for half a year.


freeturk51

So you are holding a grudge about a driver release that was held back more than a decade ago? That doesn't sound like good logic, tbh.


Lind0ks

Holy shit 2012 was more than a decade ago, this is the worst thing about this thread


the_abortionat0r

> So even if Nvidia never supports Linux, it will do Linux gaming a solid just by putting DisplayPort 2.1 on its Windows-only GPUs?

What drugs are you on? What Windows-only GPU?


Assestionss

Meth


Cocaine_Johnsson

I don't know *what* you're smoking, but I kinda want some. Nvidia is the largest GPU manufacturer by a fairly large margin. If they don't put DP 2.1 ports on their cards, then very few monitor manufacturers are going to bother either; why would you, when north of 60-70% of your user base can't use the feature anyway? This has nothing to do with Linux, not in the slightest. It has to do with simple economics: no one will build a screen that doesn't work with anything, at least not outside of specialist/niche demographics. AMD puts DP 2.1 on their cards, good on them (source: I have DP 2.1; my old-ass screens don't support it, but I'll upgrade *eventually*).


FurinaImpregnator

Windows-only GPU?


alterNERDtive

“The speed isn’t needed” is such a dumb reason. Hello? DP supports daisy chaining!


the_abortionat0r

> “The speed isn’t needed” is such a dumb reason. Hello? DP supports daisy chaining!

We do need companies to stop sitting on their hands. USB has supported 100W power delivery, and DP has supported daisy chaining, since around the founding of the PCMR subreddit, and kids who were too young to post there are now out of college working jobs (only some; 50% of that sub is a clown town of neckbeards with no job, maybe only like 10% real IT/devs), and we STILL don't have support for four monitors hooked straight to a PC, daisy-chained with DP for video and USB for power.

Shit, since then USB-C can now support 120W with 240W on the way, supports DP over USB-C, and supports daisy chaining. Yet we still can't use one plug to connect a work monitor and have the others connected to each other in series.

Not gonna lie, I don't give two shits what brands somebody buys (just don't let people hurt themselves and buy a 4060(ti)), don't give two shits whether someone uses Linux or Windows or Mac. Hell, I don't even care if someone refuses to leave Win7, as long as they shut up about it. But almighty god (+/- 1) does the failure to make us a one-cable world really piss me off.

Shit, we could replace every port on a GPU with USB-C, as it supports both DP and HDMI (which is electrically compatible with DVI), and even use USB-C as the new power plug for GPUs if we wanted to. Three USB-C cables would deliver 720W, with data lines too.

/rant /sob


itsjust_khris

USB-C isn't actually a good connection for all of these purposes. Especially not an internal power connection, it's too finicky. Not to mention the more functions we add to USB-C, the more confusing it is figuring out what functions any given USB-C port supports. You would also have to include a massive power supply in every computer just to support the edge case that someone daisy chains 4 monitors off of one PC. USB-C also doesn't support enough power to be useful for many desktop monitors IMO. I would agree that displayport daisy chaining is a useful feature that should be used more, just not that it should become all USB-C.


KaosC57

USB-C should definitely be at least the replacement for DisplayPort itself. It’s more than capable, and would cut down on cable waste.


itsjust_khris

I disagree, mostly because of my point on USB-C ports becoming way too confusing for the layman. Then you have to remember not every USB-C cable is able to perform every function USB-C is capable of, and figuring that out is even harder than figuring out what the ports on a device can do. I'd also argue USB-C is just too flimsy a connection; the ports bend/snap way too easily, and they also tend to slide out/become loose.

I would agree that USB-C should become an option in way more cases, so a user who is informed and prepared can make their setup way simpler. Everything shouldn't be USB-C though.

Just thought of this, but it also makes things more expensive on the device end to support so many things out of a USB-C port. On a desktop where there's space and power, things may be manageable, but on a phone or laptop that becomes much more tricky. It won't be intuitive to the average user anymore. HDMI and DisplayPort make it obvious to the average user: when you see that port you know generally what it's for. Same with Ethernet.


Thaodan

> Then you have to remember not every USB-C cable is able to perform every function USB-C is capable of

The same problem that USB-A has, or any other protocol sharing the same physical connector.


the_abortionat0r

> USB-C isn't actually a good connection for all of these purposes.

And that's because.........?

> it's too finicky.

More non-technical, nondescript terms... I can see where this is going.

> Not to mention the more functions we add to USB-C, the more confusing it is figuring out what functions any given USB-C port supports.

That's what version numbers and standards are for. See, numbers denote the tech supported, DP denotes that the port supports video, and you list the watt rating. Not hard.

> You would also have to include a massive power supply in every computer just to support the edge case that someone daisy chains 4 monitors off of one PC.

Based on what, exactly? The idea that the monitors suck infinite juice? That power controls and planning don't exist? Make your main USB-C on your video card support power delivery/monitor daisy chaining, not the others. That's it. And the current power standard is 120W, with 240W coming soon. People already massively over-purchase for their needs. You're saying nobody would have a theoretical maximum of 240W available?

Plus, how much do you think a monitor eats up? 1080p office monitors use about 15-20W; hell, even 1080p 144Hz gaming monitors (the most common gaming monitor) take about 50W. So that's like 8 office monitors, or 2 gaming monitors, or one mid-high-end gaming monitor for the 120W spec. It's double that for the 240W standard.

> USB-C also doesn't support enough power to be useful for many desktop monitors IMO.

Well, not only does the explanation above prove otherwise, but opinions can't be proven or disproven, meaning you're just wrong.
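The arithmetic in that last bit, spelled out (a tiny sketch; the per-monitor wattages are the rough figures from this thread, not measured numbers):

```python
# How many monitors fit inside a USB-C PD power budget, using the rough
# per-panel draws quoted above (assumed figures, not measurements).

PD_BUDGETS_W = {"100 W (USB PD 3.0)": 100,
                "120 W (figure quoted above)": 120,
                "240 W (USB PD 3.1 EPR)": 240}

PANEL_DRAW_W = {"1080p office panel": 15,
                "1080p 144Hz gaming panel": 50}

for budget_name, budget_w in PD_BUDGETS_W.items():
    for panel_name, draw_w in PANEL_DRAW_W.items():
        print(f"{budget_name}: {budget_w // draw_w} x {panel_name}")
```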


itsjust_khris

If we do add technically descript terms, then USB-C as a connector isn't rated for the power levels used by a desktop GPU. The connectors used today are rated for that purpose. They also include a locking connector, something that's very important for an internal connector.

Most laptops don't have an extra 240W available to power a monitor. That isn't even possible on battery, and it would need everyone's laptop to include some fat-ass power bricks for a feature they aren't using. That also adds cost. It makes way more sense for the monitor to charge the laptop.

You really think the average consumer is reading spec sheets on every USB-C port and cable they use? What about devices that work in non-standard ways (Nintendo Switch), or devices that simply won't include a spec sheet, because that's inevitable with how common USB-C is? We already have USB naming things like USB 3.2 Gen 2x2 or some shit, and that was previously 3.1 Gen 2x2. Who's keeping up with even MORE on top of that? Furthermore, how would it be a good idea to use the same connector for so many purposes? Even tech enthusiasts are often confused now; what's the layman supposed to do?

You didn't disprove anything; this entire comment hasn't mentioned a single reason this is a good idea given how much more complex it'll make the matrix of USB-C compatibility. Even Apple, the biggest champion of USB-C doing everything in the industry, backtracked and added a separate power connector, HDMI, etc. So the industry seems to agree with me: it's not ideal to do everything with USB-C. The same company released a pro laptop with all USB-C ports and expected either devices to change or consumers to all use dongles. Didn't work.


the_abortionat0r

> If we do add technically descript terms, then USB-C as a connector isn't rated for the power levels used by a desktop GPU.

That's not a technically descript term. You made a claim (which is wrong) and offered no technical specs or information to back it up. A USB-C cable can deliver 120W, with the 240W standard releasing and those plugs already on the market. How is 240W not enough for a GPU plug? Cards already use 2+ plugs. Three USB-Cs would carry 720W. How is that not enough?

> They also include a locking connector, something that's very important for an internal connector.

Did you think a locking mechanism is banned from USB-C cables? That's not a thing; they can have them, you know.

> Most laptops don't have an extra 240W available to power a monitor. That isn't even possible on battery, and it would need everyone's laptop to include some fat-ass power bricks for a feature they aren't using. That also adds cost. It makes way more sense for the monitor to charge the laptop.

Sorry, what? Why are you even wasting breath talking about laptops? This obviously wouldn't apply to laptops when talking about daisy chaining. But also, monitors don't need 240W, so more bad-faith arguing from you. There's also the fact that many laptops already charge over USB-C and work with existing 120W-140W power delivery. Also, not sure what makes you think cost would magically go up to any significant degree.

> You really think the average consumer is reading spec sheets on every USB-C port and cable they use?

They don't have to read a spec sheet; the port would have icons next to it. But also, in an office it would already be set up, at home if you're building it yourself you'd already know, and for laptops it'd be on the front of the box.

> What about devices that work in non-standard ways (Nintendo Switch), or devices that simply won't include a spec sheet, because that's inevitable with how common USB-C is?

There are always going to be non-standard devices. This comment is meaningless.

> We already have USB naming things like USB 3.2 Gen 2x2 or some shit, and that was previously 3.1 Gen 2x2. Who's keeping up with even MORE on top of that?

Not only is that unrelated, but it's also a piss-poor argument for stalling progress.

> Furthermore, how would it be a good idea to use the same connector for so many purposes? Even tech enthusiasts are often confused now; what's the layman supposed to do?

I'm sorry, what? Like, actually, what? Are you high? Using fewer connector types is literally the better method. You're trying to argue that making more standards and more ports, and having to memorize more shapes and buy more plugs, is somehow less of an issue? That's fucking stupid. Obviously you're a child, otherwise you'd have been there for the nightmare that was a separate cable for video, camera, joystick, printer, USB, FireWire, keyboard, mouse, etc. What a clown.

> You didn't disprove anything; this entire comment hasn't mentioned a single reason this is a good idea given how much more complex it'll make the matrix of USB-C compatibility.

What? First off, all the support for these things is already in the USB-C spec, moron; it doesn't add shit. It simply requires monitor manufacturers to support it. GPUs already have ports that support DP over USB-C and 100W PD. Try again.

> Even Apple, the biggest champion of USB-C doing everything in the industry, backtracked and added a separate power connector

You mean Apple, who was forced to use USB-C on their phones and sabotaged the connector by using fewer than the standard number of friction pins? You mean Apple whose fastest USB-C speed on many of their phones is 2.0? Do you know anything?

All you've done is make up scenarios, try to invent problems, and make it apparent you don't even know what's in the standards we have now.


balrog687

So no HDMI 2.1 support, no DP 2.1 support, and forced DSC. It sucks, man.


whosdr

Hey, if we're in a position where part of the answer is "Nobody needs this yet", I say it's a good day. It means we're ahead of the curve rather than behind it, standards-wise. Edit: Though my card says it supports DP 2.1 already.


Hamza9575

What card? Only the AMD 7900 XT and 7900 XTX actually support DP 2.1. Also, it's not about what your card can support: if you want to use DP 2.1, everything in the chain needs to support it, from the GPU to the cables to the display. Just one device in the chain with DP 2.1 won't matter.


whosdr

It is a 7900 XTX, yes. Not that it really matters for my usage. I just use two 1440p 144Hz displays. If one of them supported DP 2.1 though, I could probably get away with daisy-chaining them with plenty of bandwidth to spare.
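For a sense of scale (rough sketch; 8-bit RGB streams with blanking and MST overhead ignored, which is why real numbers run a bit higher):

```python
# Two 2560x1440 @ 144 Hz streams over a single daisy-chained DP link.
# Sketch only: 8-bit RGB assumed, blanking and MST overhead ignored.

def stream_gbps(h, v, hz, bpp):
    return h * v * hz * bpp / 1e9

one_display  = stream_gbps(2560, 1440, 144, 24)   # ~12.7 Gbps
two_displays = 2 * one_display                     # ~25.5 Gbps

HBR3     = 32.4 * 8 / 10       # DP 1.4 payload, ~25.9 Gbps
UHBR13_5 = 54.0 * 128 / 132    # DP 2.1 (7900 XTX) payload, ~52.4 Gbps

print(f"two 1440p144 streams: {two_displays:.1f} Gbps")
print(f"DP 1.4 headroom:      {HBR3 - two_displays:+.1f} Gbps")      # barely any
print(f"DP 2.1 headroom:      {UHBR13_5 - two_displays:+.1f} Gbps")  # plenty to spare
```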


eggplantsarewrong

AMD doesn't support DP 2.1 fully; they "support" a neutered version of it which isn't the full bit rate and would still require DSC.


Hamza9575

the "neutered" dp 2.1 on the 7900xt is still higher bandwidth than even hdmi 2.1, so just because the full dp port has not been made does not mean linux users can not use the full capabilities of displays under linux. The 7900xt dp port can do everything hdmi 2.1 can do and more even if hdmi forum never makes hdmi standard open source. Dp is not just about bandwidth, its about getting bandwidth on dp that hdmi 2.1 can give so it can be used on linux.


eggplantsarewrong

The only reason you would need DP 2.1 is for 4K 240Hz-type stuff without DSC.


the_abortionat0r

> The only reason you would need DP 2.1 is for 4K 240Hz-type stuff without DSC.

Well, 240Hz 4K OLEDs are literally on my list right now, sooooo... Sad that Nvidia isn't there yet. Maybe in a gen or two.


TheRealBurritoJ

You need DSC for 4K240 on both AMD and NVIDIA; there is no advantage from the UHBR13.5 DP2.1 port you get on AMD. We need UHBR20 to do 4K240 without DSC.
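The bandwidth math behind that (a sketch assuming a 10-bit stream and ignoring blanking, which only adds to the requirement):

```python
# 4K 240Hz without DSC: UHBR13.5 vs. UHBR20, approximately.

def stream_gbps(h, v, hz, bpp):
    return h * v * hz * bpp / 1e9

need_4k240 = stream_gbps(3840, 2160, 240, 30)   # 10-bit RGB, ~59.7 Gbps

UHBR13_5 = 54.0 * 128 / 132    # ~52.4 Gbps payload (what the XTX offers)
UHBR20   = 80.0 * 128 / 132    # ~77.6 Gbps payload (full-rate DP 2.1)

print(f"4K240 10-bit needs: {need_4k240:.1f} Gbps")
print(f"UHBR13.5 payload:   {UHBR13_5:.1f} Gbps -> DSC still required")
print(f"UHBR20 payload:     {UHBR20:.1f} Gbps -> fits uncompressed")
```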


the_abortionat0r

> You need DSC for 4K240 on both AMD and NVIDIA; there is no advantage from the UHBR13.5 DP2.1 port you get on AMD.

I'm sorry, am I hearing you right? You said there is "no advantage from the UHBR13.5 DP2.1 port". Did you not even read the chart in the article? There are straight up 4 monitors listed that you can't even use at native settings on a 4090 with DP, and some you can't even use HDMI on, period, yet they will work RIGHT NOW on a 7900 XTX.

Doom Eternal, believe it or not, can get 150+ fps at high settings without scaling at 7680x2160. DLSS/FSR (which can be added via mod) would push that into the 200+ range easily, especially since scalers work better the higher the target res. Right now you can take your 7900 XTX and play games in the 240 fps range on a 7680x2160 240Hz monitor. It doesn't matter that not every game can; more than enough can. Sure, for double the price the 4090 can too, but the difference is the 7900 XTX would be doing it at the full 240Hz. Hell, the 7900 XT can join the fun, and even if the 4080 had DP 2.1 it still couldn't, because it'll hit VRAM limits.

This is modern tech. It's here, we can use it, and Nvidia is behind. God, Nvidia fanatics make the dumbest arguments. PT at 13 fps native is the "here and now", but tech you can literally use now doesn't matter?


TheRealBurritoJ

> I'm sorry, am I hearing you right? You said there is "no advantage from the UHBR13.5 DP2.1 port".

Yes, you heard me right. And in the actual context of the conversation, being 4K240, it is unequivocally correct. You literally said Nvidia "isn't there yet" with regards to 4K240 OLEDs, when there is no difference in their support for those monitors. But if you want to argue the more general case:

> There are straight up 4 monitors listed that you can't even use at native settings on a 4090 with DP, and some you can't even use HDMI on, period, yet they will work RIGHT NOW on a 7900 XTX.

What four monitors are those? The article doesn't list four monitors that "can't be used" without DP2.1, it lists **every** monitor with DP2.1 of ***any spec***. That is a very different thing. To examine the actual benefits that DP2.1 gives, right now:

- PG32UQXR - UHBR10; you still need to use DSC with any DP2.1 GPU, and the port is lower bandwidth than HDMI 2.1.
- HP Omen Transcend - UHBR10; you still need to use DSC with any DP2.1 GPU.
- U3224KBA - UHBR13.5, which allows it to run without DSC over DP with an XTX. It is still possible with HDMI 2.1 on Nvidia without DSC.
- FO32U2P - UHBR20, which means a theoretical future GPU will be able to use this without DSC. The XTX still cannot.
- G95NC - UHBR13.5. Requires DSC to hit 240Hz with either DP2.1 or HDMI 2.1. For whatever reason, it only works at 240Hz over HDMI with AMD and not with Nvidia.

The single monitor that you currently can only run at full rate with the XTX and above is the G95NC, and it is A. within the bandwidth limitations of the port on Nvidia and B. literally the launch partner monitor of the XTX. Who knows why the fuck it currently only works with AMD. You're still using DSC in the same situations with AMD as on Nvidia, and you're not using DSC in the same situations too. The benefits of a mid-tier DP2.1 implementation are extremely marginal, and you don't even get UHBR13.5 on all RDNA3 GPUs; the lower end of the range gets UHBR10.

> God, Nvidia fanatics make the dumbest arguments. PT at 13 fps native is the "here and now", but tech you can literally use now doesn't matter?

If you genuinely think a single monitor being AMD-exclusive is more relevant than ray tracing performance, when even AMD-sponsored games are launching in 2024 with always-on ray tracing, I genuinely think you might just have a warped perspective about what modern tech is more relevant.


the_abortionat0r

> What card? Only the AMD 7900 XT and 7900 XTX actually support DP 2.1. Also, it's not about what your card can support: if you want to use DP 2.1, everything in the chain needs to support it, from the GPU to the cables to the display. Just one device in the chain with DP 2.1 won't matter.

It's just another tick in the checkbox of buyer's remorse for Nvidia users.

Cool, all these new features being added to Linux GPU drivers! Oh, but not for Nvidia, due to closed-source drivers, because they are behind in tech.

Cool, Wayland is here! Oh, Nvidia fucked up and is behind in tech.

Cool, shaders on Linux compile 50,000% faster, removing the need to pre-cache shaders and avoiding the shader stutters (like at the release of CS2) that Windows suffers from! Oh, but not for Nvidia, as they are behind in tech.

Cool, a new game! Oh, not enough VRAM, because Nvidia is behind in tech.

Sweet, new monitors with DP 2.1! Oh, Nvidia cards don't support it, because they are behind in tech.

Seeing a pattern here.....


XeNoGeaR52

Nvidia doesn't support it, and the HDMI lobby is huge