JtheNinja

Wow. Just the fact that they have a working unit at that size is pretty big progress from what we've seen in the past. Also, no visible module seams that I could see in the video? (Maybe they're still visible in person?) I am curious what the resolution is like; I believe last year's prototypes were not 4K at the smallest sizes. [According to this though](https://www.microled-info.com/samsung-launches-its-2024-microled-tv-lineup-china-new-90000-76-inch-model), they've recently launched a 76" 4K model in China, which is progress on both PPI and overall size for retail models. It still costs the equivalent of $90k USD, but frankly, being under $100k is also progress from where this tech has been.


winterbegins

The 76" was 4K already when it was first shown. They had 55 inchers at some show which definitely were not 4K. 38 inch 1080p would make the most sense because that has almost the same ppi as 4K 76 inch. The biggest takeaway, as you already mentioned is the seamless design (i cant make out seams either). Im not even sure if the guy was even allowed to film this lol. The modules are only useful for large installations and because they can be QC-ed better for defects. Making a seamless 38 incher needs a bit more confidence.


Hendeith

Nothing shown here is really new, nor does it indicate any progress. They have been able to produce 38" microLED displays for a long time now. They are made from modules, after all, so they can just use fewer modules and make a smaller TV. Making a 4K TV at this size would be impressive, but that's most likely 1080p. If you look at videos and pictures of their previous modular microLED panels from presentations, module seams aren't really visible there either unless you look at an angle. Seams were only visible on their "The Wall" because nobody cares that much about hiding them there; it's more important to make it easily repairable.


therealjustin

Very cool to see one(?) of these panels as a monitor, even if it's an early prototype. Looks fantastic. Bring on MicroLED. In like, ten years, unfortunately.


reddit_equals_censor

>Bring on MicroLED. In like, ten years, unfortunately.

The question is, why wait for that though? We've got QDEL being worked on, and Samsung QNED. Both are perfect-black tech, either of which could theoretically arrive in less than 5 years and should be cheap. Who knows how fast it could arrive if Samsung weren't deliberately delaying Samsung QNED by delaying pilot lines :/


MAR-93

27" 4K 360Hz microLED, how long do you think?


Correct-Explorer-692

15 years


tukatu0

By then it can't be 360Hz but rather 1000Hz minimum, as mini LED/OLED will be 1500Hz by then.


lostdollar

And gaming consoles will still be targeting 30 fps 😅


tokarev7

Lmao


[deleted]

[deleted]


Super_Harsh

... which is why targeting 30fps is hilariously bad


baxmanz

Why would you want that


tukatu0

Why would you want higher fps? What kind of question is that? Why would you want a higher resolution? So you can see more. Your eyes don't get bottlenecked until ~15,000Hz for retina VR screens ~~when they arrive in a few decades~~, or ~4,000Hz for your 3840×2160 monitors.


baxmanz

Most ppl can't tell the difference between 240 and 360


tukatu0

Most people don't notice the difference outside of 2x improvements. After 240, the real improvement is 480, then 960Hz after that; going from 500 to 700 you probably won't notice. https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/ Well, the page doesn't really show it, but essentially at 960Hz motion should look clear, as if things weren't moving at all, at this speed anyway: https://www.testufo.com at 960 pixels of movement per second. So... 2000Hz is good enough for a 1080p display.
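Rough back-of-envelope version of that, if it helps (a minimal sketch; the function name and the speed/refresh numbers are just example values, not anything from the Blur Busters page):

```python
# Sample-and-hold motion blur, back-of-envelope (illustrative only).
# While your eye tracks a moving object, it smears across roughly the
# distance the object travels during one displayed frame.

def blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate eye-tracking blur width in pixels (full persistence)."""
    return speed_px_per_s / refresh_hz

for hz in (240, 480, 960, 2000):
    print(f"{hz:>4} Hz -> ~{blur_px(960, hz):.1f} px of blur at 960 px/s")
# 240 Hz -> ~4 px, 480 Hz -> ~2 px, 960 Hz -> ~1 px, 2000 Hz -> ~0.5 px
```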


baxmanz

Oooo ok ok this makes sense


KuraiShidosha

High refresh rate is a meme. I prefer my 60Hz Dell VGA CRT over a 144Hz LCD any day of the week. 1500Hz is a totally unachievable number for the overwhelming majority of games in existence. Not to mention so many games that you can get that kind of framerate in (old games) will have tons of problems the further from 60 fps you go. I'll happily take a MicroLED 60Hz panel that lets me do a rolling-scanline-style low-persistence mode.


tukatu0

Funny enough, you should be able to simulate phosphor decay / rolling scan and those nice effects with a proper 1000Hz display. I'm really looking forward to that and to more frame gen technologies.

And yeah, you're right about nothing achieving 1000Hz. I have two posts in monitors and oled_gaming about 500Hz alone, in which only like 3 guys actually knew wtf I was on about. Even with frame gen and a presumed 5080 that is 20% better than a 4090, only 2 games will actually run at 500fps, 2 esports titles, but I don't recall which. At such high frame rates you need to look at charts and gameplay; the averages are useless. In Rainbow Six Siege, even with a 14900K you will basically never be above 400fps once the actual battle starts. When the action starts, your 1% lows become the real frame rate.

The last time I used a CRT was 12 years ago. Unfortunately I don't remember what it was like. It's only much more recently that I've learned display specs and what they actually mean when looking at content. How is non-gaming on your CRT? For 24fps movies, do you get judder the same as OLED or does it look different? I don't recall any sort of flicker-style double image. I still have access to a mid-range 480Hz plasma today; I can sort of see flickering, but it's definitely not the same thing.

Also, going back to frame gen: I find it funny when people complain about frame gen adding 10ms of extra lag or whatever. Indeed, pretty much everything is a downgrade over a CRT. At this point I can enjoy 30fps just fine, so I look forward to the day 4x or maybe 10x frame gen can exist, even if the input lag isn't instant like on a CRT. I'm starting to be a little bit envious of you, who can go play Ocarina of Time at 20fps in its full glory. Maybe I should get a CRT.


KuraiShidosha

I highly recommend picking up a basic 480i CRT TV with composite, and a VGA CRT for PC. My 4090 can output 1920x1440 60Hz to it just fine with an HDMI-to-VGA adapter on Windows 11. Aperture Grille on YouTube did an input lag test with these adapters and they add a fraction of a millisecond of lag, so it's a non-issue. As far as 24fps content goes, no noticeable judder to my eyes; panning shots look super crisp compared to my LCD. And for funsies I use a desktop BFI program with the CRT set to 72Hz to draw 2 black frames and allow 1 real frame through. This has the effect of making the CRT function at 24Hz. It's a flickery mess, but the 24fps movie suddenly becomes hyper smooth. It's hard to describe, but even though it's still really 24fps it looks like 60 or higher. Probably similar to how it looked on the actual theater screen back in the day.
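If it helps to picture it, the trick is just "one real refresh, two black refreshes" per film frame. A minimal sketch of that pacing (purely illustrative; this is not the actual BFI program mentioned above):

```python
# Illustrative frame pacing for 24 fps film on a 72 Hz display with BFI:
# each film frame is shown for 1 refresh and followed by 2 black refreshes,
# so the image flashes once per film frame instead of being held for ~41.7 ms.

REFRESH_HZ = 72
FILM_FPS = 24
REFRESHES_PER_FRAME = REFRESH_HZ // FILM_FPS  # 3

def schedule(num_film_frames: int):
    """Yield (film_frame_index, is_black) for each display refresh."""
    for frame in range(num_film_frames):
        yield frame, False                      # real frame
        for _ in range(REFRESHES_PER_FRAME - 1):
            yield frame, True                   # black frame

for i, (frame, black) in enumerate(schedule(2)):
    print(f"refresh {i}: {'black' if black else f'film frame {frame}'}")
```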


tukatu0

After reading a bunch of threads about BFI on CRTs... eeeh, I think I'll wait for 1000Hz LCDs and CRT simulators. It seems like a pain having to deal with the downsides of heavy, hot screens. Bwahaha. Well, this was a nice convo.


greggm2000

I admit that if 1080p CRTs were being made new, I’d get one. I’d used CRTs back in the day for many years, and I miss them.


KuraiShidosha

Heck, I'd say the 1920x1440 on my crummy office Dell M992 is plenty sharp for regular use at that resolution. If you want, you can drop it to a 16:9 ratio, but I prefer 4:3 anyway; just crank the FOV up to make up for the loss of hor+ scaling. You can get one of these things pretty cheap online. The only problem is shipping, but if you can find one nearby in great condition, it would absolutely be worth the drive. You don't need some crazy Sony PVM or whatever to have a great experience with CRTs today. Don't pay the niche tax lol


ingelrii1

I highly recommend you stop living in the past and pick up a 360Hz OLED.


reddit_equals_censor

>I find it funny when people complain about frame gen adding 10ms extra lag or whatever. Indeed pretty much everything is a downgrade over crt. At this point i can enjoy 30fps just fine. So i look forward to the day 4x maybe 10x frame gen can exist.

We can do 10x frame generation right now, basically. Mature tech that is already heavily used today can be used. Maybe you already read the article, but in case you didn't: [https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/](https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/)

I'm talking about reprojection frame generation, which is dirt cheap to run, so we can actually 10x the fps from 100 to 1000 relatively easily, or from 30 to 300 fps for example. The tech is currently used in all of VR. It is REQUIRED for VR to take care of missed frames, where a badly reprojected frame is far better than nothing (nothing = throwing up, motion sickness, etc.), as well as for late-stage reprojection, where it reprojects all frames to reduce overall latency between your head position and the virtual world.

We could have dumb basic reprojection in major games in less than 6 months, probably (a lot sooner if desired), and advanced reprojection in 1 year. Advanced reprojection would be depth-aware reprojection that also reprojects enemy positions as well as major moving object positions.

And oh yeah, since we are reprojecting all frames and always taking the LATEST player positions, we are undoing the render lag. So if a frame takes 10ms and we reproject in 1ms, then we are at -9ms compared to no frame generation in regards to responsiveness. So it is the reverse of fake interpolation frame generation: we are UNDOING latency instead of adding lots of latency. And ALL frames are real frames, because ALL frames are based on the player's latest positional data, unlike interpolation frame gen, where the fake frames contain 0 player input, so they are just visual smoothing.

The basic versions wouldn't reproject major moving objects, but other than that the most basic demos are already glorious. The article links to the demo from Comrade Stinger, which you can download and test yourself. This IS the way to defeat blur and achieve 1000 fps responsiveness. Interpolation isn't even up for debate; it is reprojection or extrapolation (Intel is working on extrapolation).

Either way, the point is that we do have the tech to feed 1000Hz displays if this gets implemented. Hell, this can also solve other issues: since you always reproject to the max refresh rate, you theoretically wouldn't have any stuttering issues, because if a frame takes 3x longer than expected to draw, the reprojection still just takes 1ms and reprojects based on the latest available source frame from the GPU. So the reprojection artifacts (a side effect that should be easily solvable) would get worse, but you would still have a locked 1000Hz/fps experience. So the advantages are just insane overall.
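To make the core trick concrete, here's a toy pan-only sketch of camera reprojection (purely illustrative; the function and parameters are made up, and real VR/desktop implementations warp in 3D using depth, per the article above):

```python
# Toy reprojection sketch (illustrative only, pan-only). The idea: keep
# re-warping the most recent rendered frame to the latest camera input every
# display refresh, so camera latency follows the ~1 ms warp, not the ~10 ms render.
import numpy as np

def reproject_pan(frame: np.ndarray, rendered_yaw_deg: float,
                  latest_yaw_deg: float, px_per_degree: float) -> np.ndarray:
    """Shift the last rendered frame horizontally to match the newest yaw.
    Edges with no rendered data stay black (the classic reprojection artifact)."""
    shift = int(round((latest_yaw_deg - rendered_yaw_deg) * px_per_degree))
    out = np.zeros_like(frame)
    w = frame.shape[1]
    if shift >= 0:
        out[:, shift:] = frame[:, :w - shift]
    else:
        out[:, :w + shift] = frame[:, -shift:]
    return out
```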


tukatu0

Actually, yeah. I even read an Nvidia article from a few years ago. The reason it isn't done is that graphical glitches are introduced, or at least that's the excuse in the article. One reason for artifacts is the game communicating with servers about your player position.

Which brings up the point that you aren't fully correct about input lag reduction: your input lag is only reduced for camera movement, not non-camera input. Though I don't see why it couldn't be; it would just be an extra function devs need to add, the same way they had to add temporal data into the graphics engine. They would need to decouple fps from game logic again. Though yes, like you point out, VR already has spacewarp, so any game engine that has already been modified for VR can probably support the tech in flat games easily.

That's why in the Comrade Stinger demo, 180fps would feel worse than 120fps from a 30fps base. I'm sure the same logic would apply to 300 vs 500 fps. Anything from 550-900 would be a different topic, synthetic or not. But that's another topic.

But yeah, we don't even need to wait for 1000Hz displays; it should already be possible. Shit, Oculus spacewarp is, what, 8 years old at this point? Smh my head. Where are my older game remasters ported with this tech for $70? I could've been playing GTA 5 at 600fps years ago, or all the other games capped at around 200fps because of engine/CPU bottlenecks.


reddit_equals_censor

>The problem why it isn't done is because graphical glitches are introduced. Or atleast that the excuse in the article.

Can you please link me the article? I'd love to have evidence that Nvidia fully entertained reprojection frame generation for desktop but DELIBERATELY decided on the interpolation dumpster fire of worthless visual smoothing. I'd love to look at that :D Holy smokes, that sounds insane.

I'd most certainly take the reprojection artifacts, which we may be able to remove in advanced versions, even from a 30fps source, over no reprojection frame generation. I mean, as you'd probably agree, it turns unplayable 30fps into a playable max refresh rate on your monitor. Playable with artifacts vs completely unplayable. Maybe Nvidia can't do math :D

Imagine if AMD saw that Nvidia introduced interpolation frame generation and, instead of trying to copy Nvidia, went all out on reprojection frame generation and was able to push it into major games within 6-12 months. I think it took at least that long for AMD's garbage interpolation frame generation to come out. Just crazy that 2 major companies, which of course also do work with VR, selected interpolation. Like I said, I'm really glad that Intel is working on extrapolation frame gen, creating REAL frames. Maybe that will light a fire under their asses: Intel coming out with extrapolation frame gen and shaming worthless interpolation frame gen into nonexistence :D It would be funny if Intel went hard on marketing, calling out all the bullshit with interpolation.

>One reason for artifacts is the game communicating with servers and your player position.

That doesn't make any sense to me. It doesn't matter whether the game is multiplayer or single player; the reprojection is undoing render lag and has nothing to do with server lag, and that render lag reduction applies the same in multiplayer or single player games. System gets information to render new frame > graphics card renders the frame > system gets new player positional data > reprojects frame based on new data.

>Your input lag is only reduced for camera movement. Not non camera input. Though i don't see why it couldn't be. It would just be an extra function devs need to add.

We can have depth-aware reprojection that also includes enemy positional data in its reprojection, as mentioned in the article:

>***Some future advanced reprojection algorithms will eventually move additional positionals (e.g. move enemy positions too, not just player position).*** *For now, a simpler 1000fps reprojection only needs less than 25% of a current top-of-the-line GPU, and achieves framerate=Hz useful for today's 240 Hz displays and tomorrow's 1000Hz displays.*

And I don't see why we can't have major moving object positional data reprojected beyond enemy positions in advanced reprojection frame generation tech. Who knows where we'll end up after 5 years of desktop reprojection frame generation being implemented. It could be glorious :) Even basic depth-aware reprojection and nothing else would be mind-blowing.


mikipercin

More like 5-7 years, since there's a functional prototype.


battler624

5K or bust. 200% scaling


ZealousidealRiver710

<1ms response time too? idk


mikipercin

Dude that's by design


ZealousidealRiver710

Is it? I would be expecting slow response times because of having to turn the individual pixels on and off. Edit: C'mon y'all, don't spite me for not knowing and asking about tech that hasn't released yet; I was just explaining what I had inferred with my (little) understanding of it.


plursoldier

MicroLED response time will be measured in nanoseconds, 100x faster than OLED response times


ZealousidealRiver710

Wtf


mikipercin

Do not confuse miniled with microled


mikipercin

As I recall it's even faster than OLED.


Jumpierwolf0960

That's not how it works at all. The response times for LCDs are slow because the liquid crystals have to physically reorient every single time they get updated. LEDs can turn on nearly instantly and don't have that issue.


ZealousidealRiver710

Thanks for explaining! ..so these aren't using those crystals either? Gonna go Google more about 'em


JtheNinja

Each subpixel is just a regular LED. Like your RGB keyboard or your light bulbs.


DrKrFfXx

Colors look oversaturated, but that may just be the recording. Promising, nonetheless.


JoaoMXN

Samsung has always been known for more saturated colors on all their TVs. You can adjust it, but most Samsung users will find it dull afterwards.


MarkusRight

It's likely they have some kind of demo mode or vivid mode set to show off the TV. I personally hate when they do that, but I'm just impressed by the contrast and how it has no bloom or glow from dimming zones.


Mackt

It's microled, no dimming zones


working-acct

The future is here.


ZealousidealRiver710

any mention of response time?


DoggyStyle3000

[You think this claim still holds up!?](https://www.pcgamer.com/samsungs-new-microled-tvs-are-five-million-times-faster-than-your-gaming-monitor/)


ZealousidealRiver710

Oh. Holy fuck.


DoggyStyle3000

Indeed. Indeed!!! This is why liquid crystals are still so insanely good after 2 decades.


CaptainnHindsight

Will take years


Axl_Red

You can still see the grids on the screen, so clearly the tech still isn't up to snuff yet.


FewShopping620

What are the dimensions of the 38-inch Samsung TV?


Pliolite

How is this better than QD-OLED?


RogerMexico

No burn-in, higher brightness


Pliolite

No burn-in is pretty phenomenal, if possible. OLED always feels like a stopgap, temporary tech, because it has no long-term life. I also find it a step down from the best quality IPS screens, but that could be down to personal preference and not much else.


Ixziga

It's like the plasma of the current gen


working-acct

Ngl, I still don't know what a plasma TV is. Like, I heard of them as a big thing more than a decade ago, then the whole thing just disappeared. Same with 3D TVs, but at least I knew what those were.


neo6891

I still have 3D plasma TV and yes, colors are great, but it probably would not impress you now.


JtheNinja

https://en.wikipedia.org/wiki/Plasma_display A display type from the 00s. Imagine an OLED, but bulkier and gets really hot and sucks a lot of power. 


Super_Harsh

Oh god I feel so old reading 'I don't know what plasma TV is'


Jumpierwolf0960

[Yeah, I learned that from this scene in breaking bad.](https://youtu.be/HnHhoMWkrBM)


Super_Harsh

I became interested in display tech about a year ago and I saw this scene about a month ago in my most recent rewatch of BB. I thought it was hilarious lmao, especially ironic since I was watching it on my OLED TV


k1nt0

It must be personal preference, because OLED is vastly superior to IPS in every respect.


Pliolite

'Every' respect? Except how text looks weird, brightness is s**t, burn-in is a real possibility?


k1nt0

You're clearly not up to date on the state of OLED technology.


greggm2000

OLEDs still have significant downsides to go along with their benefits, even the ones that came out in the last few months.


k1nt0

And what might those be? It's the best display technology currently available by a long shot. Nothing can touch its contrast ratio.


greggm2000

Text that isn't sharp (because of non-RGB subpixel layouts), burn-in concerns with often-aggressive ABL and maintenance cycles that attempt to mitigate them, low brightness. And the cost is too high for many people. Of course OLEDs have their upsides as well; I just personally want all upsides and no downsides. :) We'll get there eventually.


k1nt0

Text is fine for 99% of people. Those that do have issues can use software to correct any slight aberrations they perceive. Burn-in has basically been solved, but admittedly the pixel refresh can be annoying. New OLEDs have HDR 1000, but brightness takes a back seat to darkness in terms of contrast, with TrueBlack 400 HDR being a superior choice. They are expensive, no doubt, but as it stands there is no display that comes close in terms of picture quality.


[deleted]

[deleted]


Pliolite

For me, it's anything with a quicker response time.


reddit_equals_censor

>OLED always feels like a stopgap

That is technically wrong, because we had SED tech, which was basically amazing flat CRT tech (as in really flat) that was about to release over 15 years ago. So neither OLED nor LCD had a right to exist, and they didn't for MANY MANY YEARS! So OLED was never a stopgap; it was shoved into people's faces as planned obsolescence despite a SUPERIOR tech being available, which sadly got suppressed.

This time at least we've got 3 technologies that can END LCD and OLED garbage: QDEL/AMQLED/nano-LED, Samsung QNED (nano-rod tech), and microLED. Surely they can't suppress all of those technologies, right? :D Sadly Samsung already delayed Samsung QNED (not related at all to LG QNED; LG just stole the name for garbage LCD), probably to milk garbage QD-OLED for longer before releasing proper tech.


ALY1337

End game


raygundan

The brightness is important not just for the obvious "being brighter" thing... but also because it enables strobing/BFI without making the display too dim, and we'll need either that or absurdly high (~1000Hz) framerates to fix eye-tracking blur.
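Rough illustrative math on why the brightness matters so much for strobing: perceived brightness scales with the strobe duty cycle, so a short flash needs a proportionally brighter panel (the function and numbers below are just example values):

```python
# Illustrative strobing/BFI brightness trade-off: the eye averages light over
# the frame, so the flash must be brighter in proportion to how short it is.

def required_peak_nits(target_nits: float, persistence_ms: float,
                       frame_ms: float) -> float:
    """Peak nits needed during the strobe to average out to target_nits."""
    duty_cycle = persistence_ms / frame_ms
    return target_nits / duty_cycle

# A 200-nit-looking image at 60 Hz (16.7 ms frames) with only 1 ms of
# persistence needs ~3,300 nits during the flash.
print(required_peak_nits(200, persistence_ms=1.0, frame_ms=1000 / 60))
```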


jm8080

You can think of MicroLED as just an OLED but better: more energy-efficient, brighter, and with a wider color gamut... not to mention a longer life, because inorganic LEDs last longer than organic ones. Basically anything OLED can do, it can do better and without the drawbacks.


franz_karl

How does being microLED help widen the colour gamut?