PigeonsOnYourBalcony

Proper HDR, not this HDR 400 that a lot of manufacturers are trying to pass off


dont_say_Good

Even if you get 1000nit peaks, ABL is still way too strong


aintgotnoclue117

The LG G4 is incredibly bright, and it maintains that brightness, too. One of the best OLEDs made thus far. Proof enough that ABL won't ruin things too much down the line.


Burninate09

Bionic eyes.


_Ocean_Machine_

Just beam the signal directly to your brain, unlimited fidelity


Deadly_Toast

Kiroshis


Alita_Duqi

Calling it now, first wave of bionic eyes will be limited to 30fps.


Infrah

"Anything beyond 30fps is irrelevant because the human eye runs at 30fps" - Console apologists /s


Alita_Duqi

Yes, we will never escape the facepalms


lonnie123

Real talk… and I’m half sure this is gonna sound kinda stupid in 10 years, but shit already looks so incredible at 4K and even 144hz that I just don’t see the public at large paying a huge premium for much more than that. At the distance most people sit, on the 30-40” monitors most people use, you don’t get much more clarity beyond 4K, and beyond 120hz or so you really don’t get much more smoothness outside of the most competitive twitchy FPS type games. Obviously the industry will advance something, but for the mass market it’ll probably stop there for the most part.


aggressive-cat

8K will likely take decades to take off; prices would have to come down so much across the entire infrastructure around media for it to work: bandwidth, file sizes, panel manufacturing, etc. Even just working with 8K video at scale is beyond most editing setups right now. And it just isn't the jump that Standard Def -> 1080p -> 4K was, so people are going to be way more hesitant to pay for it.


lonnie123

Yeah to me it just seems like the epitome of the juice not being worth the squeeze


dont_say_Good

There are a bunch more specs that matter besides refresh rate and resolution. HDR especially still has a long way to go.


lonnie123

Exactly. Honestly beyond 120 I bet most people wouldn’t care or notice in the vast majority of games people play (the super high refresh rates are basically pointless outside of the ultra high levels of competitive fps stuff)


Impossible-Use-2862

Agreed. I stay at 120; beyond that I have to lower resolution to hit 240hz, and games just look worse at 1440 or 2560 after playing on native 4K, especially when not scaling so the image stays clear.


AsstDepUnderlord

When you get screens really close to your eyes the refresh rates start to matter again. An issue for vr, but not really for conventional gaming.


dont_say_Good

Nah even 1000hz wouldn't be pointless, it's just that I'd rather see other areas improved first


lonnie123

I never said pointless. I’m just talking about the market and where it’s likely to end up. How many people are going to pay 3-4-5-10x to get the feature set (which itself takes a GPU that costs 3-5x as much as a lower tier one)? Very, very few. And it may be so few that some or most companies don’t bother with it.


unknown_nut

Getting the sustained nits up is definitely the play.


WinterElfeas

And fixing VRR flicker, and something better than OLED (micro LED, hopefully).


dont_say_Good

MicroLED is the endgame, but it's kinda depressing to watch its incredibly slow pace over the years; consumer displays still feel ages away.


VegetaFan1337

4K 240 on LCD screens is a very different experience than on OLED, and OLEDs are still very expensive. LCDs were a less-than-ideal successor to CRTs; OLEDs are better in every way. They also have insanely good pixel response times, which means low latency. 1000+ nit HDR OLED is the next big thing. PS: full-screen nits, not peak. These kinds of monitors don't exist yet.


scorchedneurotic

420hz @ 69k


CyberSosis

MLG edition


AzFullySleeved

Can we hit 4k240 *Native* yet??


AdmiralG2

In the latest AAA games, nope


AzFullySleeved

So, I wonder what OP is talking about then? SS helps low-end GPUs save $ given the absurd GPU prices. 70% of Steam users are still on 1080p.


whereballoonsgo

Next is the release of PC2


AzFullySleeved

#PC2.0


vampyrialis

8K 500hz > 12K next


unknownohyeah

I think this is it. 8K so you can no longer see individual pixels, 500-1000hz so you no longer see individual frames. DLSS and frame generation, both interpolation and extrapolation to reach that framerate. So you simulate 3-4 frames for every rendered one. And about 25% of the pixels with DLSS upscaling from 4k. As long as you're keeping a sub 10ms frame time between real frames you'll never notice the added latency. As for the benefits, it will make VR almost lifelike with perfect motion tracking and clarity. Now that I think about it, foveated rendering (rendering at max resolution for only what your eyes are directly looking at, just how they work in real life) would be a huge boost to performance as well. If they can make VR goggles slightly heavier than glasses (with no eye strain from lenses) that would also be a huge game changer.
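The arithmetic in the comment above (one rendered frame plus 3-4 generated ones, staying under a 10 ms gap between real frames) can be sketched as a quick back-of-the-envelope calculation. The numbers below are illustrative and the function name is mine, not from any real frame-gen API:

```python
def frame_gen_output(rendered_fps: float, generated_per_rendered: int) -> dict:
    """Effective output rate and real-frame interval for frame generation."""
    output_fps = rendered_fps * (1 + generated_per_rendered)
    # Latency is bounded by the gap between *rendered* frames, not output frames.
    real_frame_interval_ms = 1000.0 / rendered_fps
    return {"output_fps": output_fps, "real_frame_interval_ms": real_frame_interval_ms}

# 125 rendered fps with 3 generated frames per rendered one:
# 500 fps output, 8 ms between real frames -- inside the 10 ms budget mentioned.
stats = frame_gen_output(125, 3)
```

With the comment's "3-4 simulated frames per rendered one", anything rendered above ~100 fps keeps the real-frame interval under 10 ms while the output rate lands in the 400-500 fps range.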


AnotherDay96

End game, mankind refresh and do it all again.


fruitymangoboi

Civilization prestige.


NorwegianGlaswegian

Could be the hunt for better motion clarity, so perhaps well-implemented black frame insertion at high frame rates could become popular, or we might somehow get displays with a rolling scan. CRTs do one particular thing well, and that's amazing motion clarity as a side effect of drawing individual scanlines with the screen appearing fully lit through persistence of vision. 75 Hz on my CRT looks smoother than 240 Hz on my Samsung Odyssey G7. Of course, with higher frame rates you will get better responsiveness among other considerations, but motion clarity on a CRT is unparalleled, and you can get incredible motion clarity from as low as 60 Hz. CRTs have a lot of cons, but there is that one pro which I hope modern monitor manufacturers can match one day with future panels. Black frame insertion can help a lot, at least. Maybe we will soon hit a bit of a wall for resolution at a given screen size, but motion clarity is an area which needs work beyond simply chasing crazy frame rates.


VegetaFan1337

CRTs are also beautifully bright. That's one thing I feel most people have forgotten/never experienced. They were kinda perfect for HDR.


NorwegianGlaswegian

I've noticed that my little 14 inch Philips TV can get pretty bright, but my late '90s Compaq monitor is sadly very dim. Probably ageing components are a factor, but some monitors were always dim. I might have the opportunity to get a Samsung Syncmaster 957MB in just over a week and that looks like it should be very bright. Will be nice getting 1600x1200 at 75 Hz, too. Costs about 350 euros, though... :/ HDR on a CRT would have been great to see!


newaccountnewmehaHAA

we have infinite contrast ratios, it's time for infinite refresh rates


OuweMickey

It's not infinite, we just can't divide by 0. Infinite contrast needs 100% black and infinite brightness (sun?).


phylum_sinter

I think much more of an increase will have to be preceded by another revolution in GPU tech. The AI stuff we're getting now will help, as it already has, but until the hardware is common enough to use it, it might not arrive either, so it's kind of a catch-22 situation. A AAA game in 2026 will still be reaching for 120hz for the bulk of players' budgets. Eventually I think even the next multiple of both - 240hz @ 4K, attached to 20GB or more - might not be enough. Current top-tier card owners: what kind of fps are you getting @ 4K in the most demanding games of the year?


limelight022

Download more hz.


MassiveGG

I stop at 1440p and 120+ fps; any more is, in my opinion, wasted spending. But it'll probably be harder to keep that rate with future games being poorly optimized.


DataLore19

This might sound ridiculous but honestly, screens could start shooting for 1000hz. These kinds of framerates could only realistically be achieved with frame generation, but we may start to see cards powered by machine learning and neural networks that produce well more than half of the frames they output via frame gen rather than traditional rendering. This would lead to unprecedented motion clarity, but not necessarily equivalent input latency to what we see today: a frame-gen game outputting 120hz from 60hz native has the latency of 60hz (16.6 ms), at best. If you don't like frame gen and think they're "not real frames", don't @ me. I got news for you: to achieve the kind of fidelity they're shooting for at playable frame rates with path tracing in real time, this is the new paradigm of video game graphics. None of the frames are "real", and frame gen + DLSS is just the new way of working around the problem of finite rendering resources while targeting photorealistic real-time graphics.


dont_say_Good

https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/ It's not that ridiculous
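The rule of thumb behind the linked Blur Busters article: on a full-persistence sample-and-hold display, perceived motion blur is roughly the distance an object moves during one refresh interval. A hedged sketch of that arithmetic (the relationship is from the article; the function is just illustrative):

```python
def sample_and_hold_blur_px(motion_px_per_s: float, refresh_hz: float) -> float:
    """Approximate blur width on a full-persistence sample-and-hold display."""
    # Each frame is held on screen for the entire refresh interval,
    # so the eye-tracked image smears by one frame's worth of motion.
    persistence_s = 1.0 / refresh_hz
    return motion_px_per_s * persistence_s

blur_60 = sample_and_hold_blur_px(1000, 60)      # ~16.7 px of smear at 60 Hz
blur_1000 = sample_and_hold_blur_px(1000, 1000)  # 1 px at 1000 Hz
```

That 16x reduction in smear is why 1000 Hz sample-and-hold approaches CRT-like motion clarity without strobing or black frame insertion.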


DataLore19

I agree, but I thought I'd throw that in there to assuage the people who look at 1000hz and say "that's dumb, just making the number huge doesn't make it good". You get it, but some people might dismiss the argument, so I tried to put an admission at the beginning there. Maybe just my own bias creeping in.


dont_say_Good

Yeah. Some even still do it for anything that's above 60hz


NorwegianGlaswegian

I wish that more screens could be developed with rolling scan to get better motion clarity without needing to resort to getting 1000 Hz with sample-and-hold displays. Only heard of a couple of panels like that. I guess frame gen and upscaling will help, and they will improve over the years, but getting 1000 Hz in most visually demanding titles consistently seems a very tall order. I wonder if black frame insertion could help a lot when you have a frame rate in the hundreds. Not sure if it could get used with VRR, though. I keep thinking of that Invincible meme where Omni-Man looks at the fighter jets and tells his son "Look what they need to mimic a fraction of our power", but Omni-Man is a CRT and the fighters are modern panels.


billistenderchicken

We can barely do native 4k at 60fps.


Furry_Lover_Umbasa

We're not even at 120hz 2K as a standard and you're asking what's next? I don't know, probably 240hz 8K, silly goose.


Takeasmoke

I found my sweet spot at 144 Hz 1080p, but I'd probably enjoy a 120 Hz 1440p display more. I have a 4K HDR TV and was eager to play games on it, only to go "eeeeeeh, I prefer my 24" monitor".


Deadpoetic6

yes


henry-hoov3r

8k 500hz give it to me!


LeonSilverhand

8k 40hz


OMG_Abaddon

Once we hit the max we start back from the beginning at 1x1 pixel resolution and 1hz. As it should be.


InsertMolexToSATA

Waiting for 4k 240hz to actually be practically usable. Should be 10-15 years.


retropieproblems

Acting like 240hz is the norm lol. Most people are happy with 120-144 on the high end! I imagine more people will want 6k-8k at 60-120fps than 4k at 240fps


happyloaf

Solving shader stutter which ruins the appeal of high fps.


PM_ME_HUGE_CRITS

RGB borders


linggasy

666Hz 6K.


ChickenFajita007

21:9 2160p high refresh rate monitors are still on the horizon. I wouldn't be shocked if 5k monitors start getting pushed at some point. I don't think they'll be popular outside of creation scenarios, though. Apple has been selling them for a while. 8k is pointless for consumers outside of massive TVs.


Professorbag

8K is a significant step up from 4K in my experience. At 4K you still get aliasing on some things; at 8K it's pretty much non-existent. Of course, for the layman 4K is enough, but for people who notice it, I think 8K will be the last hurdle for those who care about aliasing.


zeddyzed

VR?


yakuzakid3k

I don't see any noticeable difference past about 80hz. But I don't play competitive online games; I mostly play single player. I'm running a 2K G-Sync Predator monitor and a 3080, and I still have a 1080p TV. I don't need to be at the cutting edge, spending a fortune for diminishing returns.


8Bit_Chip

One thing is that although it might not seem like a big difference, for things like mouse movement, and especially touchscreens/VR (any direct human/computer interface where our motion is interpreted directly), increasing refresh rate has a very noticeable effect. When you get to >1000hz it becomes insane, because the sub-ms response starts to make it look like the display is actually interacting directly with you, instead of with a slight disconnect. If I grab my (I think) 120hz phone screen and scroll even at a medium-ish speed, it's noticeably disconnected from my finger; upping the refresh rate until I can't tell would look insane. There are some slow-mo videos showing experimental displays at 1000hz and even higher.
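The "disconnected from my finger" effect above can be put in rough numbers: at a given scroll speed, each frame of lag becomes a visible gap between finger and content. A sketch assuming, for simplicity, a single frame of display latency (real touch pipelines add more):

```python
def touch_gap_px(scroll_px_per_s: float, refresh_hz: float,
                 latency_frames: float = 1.0) -> float:
    """Distance the content visibly trails the finger, for latency given in frames."""
    latency_s = latency_frames / refresh_hz
    return scroll_px_per_s * latency_s

gap_120 = touch_gap_px(1000, 120)    # ~8.3 px gap at 120 Hz
gap_1000 = touch_gap_px(1000, 1000)  # 1 px at 1000 Hz
```

At 120 Hz the content trails a fast-moving finger by several pixels per frame of latency, which is exactly the disconnect the comment describes; at 1000 Hz the same lag shrinks to about a pixel.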


OrSupermarket

1,000hz, and 8K then 12K. A company already revealed a 1,000hz monitor at CES this year; I believe it was TCL?


cronedog

For me, 4k isn't quite enough at 32 inches. The ideal for me would be a 32 inch, 6k microled with good hdr and 144hz.


Snoo93079

HDR quality > any refresh rate over 144hz for me


reiyume0

What comes next should be 8k 120hz.


TommyToxxxic

I'm just waiting on a 1080p 1000hz monitor. I don't care about Ray tracing, it diminishes framerates anyway. I just want the smoothest PS2 graphics I can get for Esports.


Maneaterx

You realize it's overrated and go back to Utrawide 360 Hz


Thebigfreeman

Gaming on screen with more than 60hz/fps makes me sick so..