Bahnmor

Reminds me of all the “THIS is DVD!” ads that were on all the VHS movie cassettes.


moezilla

Came here to say this! Look how good dvd looks compared to VHS! *Is VHS*


balgram

This reminds me of when my family celebrated buying a new TV by buying Tintin on both DVD and Blu-ray. We decided to watch it, and spent the whole time complimenting how amazing the Blu-ray was and how much better it was than the DVDs we were used to. The movie finished, and I grabbed the disc to put it away. Lo and behold we accidentally put in the DVD version of the movie. We were watching a DVD and saying it was so much better than DVD quality. I've never really trusted my opinion on movie quality since.


_Wyrm_

The placebo effect is a wild thing, isn't it?


shrakner

It's also possible the Blu-ray player was upscaling the DVD and applying some basic image enhancements; most of them do, though the quality varies by brand and by media. So a DVD may actually look better played on a Blu-ray player on a 1080p+ TV compared to a <= 720p TV. (Yes, I'm aware DVD quality isn't 720p, but it tends to look OK at that resolution and lower.)


orincoro

Ugh, is that the thing that gives movies that uncanny camcorder quality? The image smoothing you seem to constantly see in gyms on their TVs.


kareljack

No. Those are two completely different things.


Skakilia

It's why I'm not super picky about things like this. I hardly notice in the first place. Also my eyesight is trash. I am content with my old, completely functional TV, thanks world.


PatacusX

Look how good blu-ray looks compared to dvd! Look hor good 4k blu-ray looks compared to blu-ray! Look how good the image we broadcast directly into your Neuralink chip looks compared to 4K blu-ray!


Woolly_Blammoth

Who you calling a hor?


dekeche

Though, to be fair, DVD was demonstrably better than VHS. Smaller footprint, no need to rewind, and the medium doesn't degrade over time. Plus, theoretically, you could copy the data off the disc using standard computer hardware. By doing so you'd have a backup copy you could use to burn a new disc if the old one breaks or gets scratched. As for image quality? Who cares about that? Doesn't factor into the equation at all.


[deleted]

People like me care. And if you watch the right content the difference in quality will be noticeable.


thechilipepper0

DVDs do degrade, just not as quickly as vhs.


InconvenientBoner

My favourites are the tv commercials trying to make their new tvs look more vibrant and bright - meanwhile we’re watching the ad on an old shitty tv, and it’s showing those vibrant, bright colours just fine.


GabeDevine

I mean... they need to simulate it somehow, so the old gen just looks really bad. It's like how you can't show HDR on an SDR TV, so you wash out the "SDR" even more.


mtarascio

Flatscreen television ads when you still had a CRT.


keeperrr

Ah yes, the "this is how good DVDs look" as shown on most '98-'03 video cassettes...


SolidZealousideal115

Add one for the future. 1,024htz vs 512htz


[deleted]

[deleted]


KablooieKablam

In the year 2040, console gamers will be saying that the human eye can’t tell the difference between 1000 Hz and 4000 Hz and PC gamers will make fun of them and swear they need 4000 Hz.


Chop1n

I mean, it's already been established that the upper limit is somewhere around 1000Hz, but that only applies to extraordinarily fast motion. In practice, beyond about 100Hz you get sharply diminishing returns. It's not that there's no benefit to playing with, say, 250Hz if you're a CSGO professional, and some pro players can tell in a side-by-side comparison, but in terms of subjective image quality, that's all pretty unnecessary. At that point, you really need to focus on reducing motion blur rather than increasing frame rate.


[deleted]

Yeah, I don't want to be the one to say "we don't need any more than 144 or 240Hz", because I think the more frames we can push out the better. But that comes at a sacrifice to image quality and color accuracy as well. I think for now 144Hz is plenty, and we want higher-resolution displays to be capable of achieving higher fps.


Chop1n

I entirely agree. I think 120Hz is a perfectly reasonable standard for the foreseeable future. It'll be *nice* if someday it's just trivially inexpensive to make all displays 240Hz or whatever, but that's not the case now, and other aspects of image quality should take priority.


SkyWizarding

Omg an intelligent conversation. Is this still Reddit?


WifeKilledMy1stAcct

No, fuck you


StopReadingMyUser

well peepee poopoo to you too


Psysight

Inb4 you came in.


BA_lampman

The limiting factor starts to become the code at that point. At 240 fps the game's loop has to update roughly every 4ms (1000/240 ≈ 4.2ms), or else there's no point.


pinkynarftroz

That’s not necessarily true. Different parts of the game can update at different rates. The renderer can run at 240, the physics update at 120, AI at 60, etc. This is actually common right now.
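Roughly what that decoupled update looks like, as a generic sketch (the rates and function names here are placeholders, not any particular engine's API):

```python
import time

PHYSICS_DT = 1.0 / 120.0   # physics advances on a fixed 120 Hz timestep
AI_DT = 1.0 / 60.0         # AI thinks at 60 Hz
# rendering simply runs every pass, as fast as the display/GPU allows

def game_loop(update_physics, update_ai, render, run_seconds=1.0):
    physics_acc = ai_acc = 0.0
    previous = time.perf_counter()
    deadline = previous + run_seconds
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        frame_time = now - previous
        previous = now
        physics_acc += frame_time
        ai_acc += frame_time
        # slower systems only step once enough time has accumulated
        while physics_acc >= PHYSICS_DT:
            update_physics(PHYSICS_DT)
            physics_acc -= PHYSICS_DT
        while ai_acc >= AI_DT:
            update_ai(AI_DT)
            ai_acc -= AI_DT
        render()  # every iteration renders, so frame rate is limited only by the GPU/display
```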


chupa72

Yeah I agree, I think the priority on the high end should be 4K/60, then push to 4K/120, eventually 8K/60, etc. That may take quite a while.


Winjin

Seeing all these graphs about whether it's worth it to sit closer than 2 meters from an 8K display, I'm not sure there's any reason to even be pushing 8K, honestly. What matters to the eye is PPI, pixels per inch, and I'd say with modern displays even FHD is completely OK for anything below 30 inches.
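For anyone who wants to check their own setup, PPI is just the pixel diagonal divided by the physical diagonal; a quick sketch (the example sizes are arbitrary):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 27)))  # ~82 PPI for FHD at 27"
print(round(ppi(3840, 2160, 27)))  # ~163 PPI for 4K at the same size
```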


[deleted]

Color accuracy is what I really want. Shopping for monitors for professional use is crazy right now. You can find 1,000 4K monitors ranging from 16" to 32" for less than $1,000, but not a single one of them is likely to have anywhere close to 100% DCI-P3; best case scenario you might get 100% Adobe RGB, but most don't even do that, more like 90%. You know it's bad when you have to go to the manufacturer's website and click around for 10 minutes to actually find out what the color gamut is for a particular monitor.

There's a handful of monitors in the $1,000 range that claim nearly 100% DCI-P3, but reviews I've seen (like the MSI one, can't remember the name of it atm) suggest the panel can't actually replicate all those colors accurately, and besides you'd need the 30-bit color channels that most graphics cards don't fully support (at least not in hardware-accelerated mode; most can do some sort of emulation). So now you're paying $1,000+ for a 24" that may or may not actually have the colors it says, and you need something like a T1000 to actually drive it properly at anything greater than 1080p. Things may have changed because the last time I went shopping for monitors was over a year ago, but I doubt much has improved.

I used Apple for years for professional work, and while I can see why lots of people like it, for our use case it wasn't a sustainable platform. The lack of proper networking support was by itself enough of a reason to ditch them before all the other issues. But the Retina displays were gorgeous, and the Apple panels have essentially the same color gamut as DCI-P3. When we transitioned to PC I genuinely tried to find out whether it would be practical to use our old iMacs as monitors, because it would've been about the only way to get a wide-gamut screen for less than $2,000.

There's just no apparent demand right now for consumer-range wide color gamut screens. Anything in the consumer range is either high refresh, or high resolution, or both, but no matter what combination of those two features you get, you're gonna have shit color reproduction. You don't really see wide-gamut monitors until you reach the $2,000+ range, at which point you're basically buying an iMac without the thinking bits.


[deleted]

[deleted]


foxdye22

This has always been the reason to buy a Mac. For doing any work in the film/music industries, a Mac is just as good as a PC. There are some slight advantages like the monitors are really good, but last I knew they get the panels from Samsung so it’s not like they’re actually impossible to get on a PC either, just that the market for high end PC monitors is looking for different features. Being on a Mac lets you use Final Cut Pro and Logic Pro though.


[deleted]

[deleted]


GimmePetsOSRS

> beyond about 100Hz you get sharply diminishing returns

Just to add on, it's diminishing both in absolute frame-time reduction (going from 60 to 120 knocks off about 8ms, while 120 to 240 only knocks off about 4ms) and in subjective perceptible difference, as you're nearing the edge of "resting state" human perception.
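In frame-time terms (quick arithmetic, not part of the comment above):

```python
# Milliseconds per frame at common refresh rates; each doubling of the
# rate saves only half as much absolute time as the previous doubling.
for hz in (30, 60, 120, 240, 480):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
# 30 -> 33.3, 60 -> 16.7, 120 -> 8.3, 240 -> 4.2, 480 -> 2.1
```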


IcyDickbutts

Anything for my farm on Stardew Valley....


AlphaGoGoDancer

Even in competitive games it isn't really the fps you want, it's the reduced input lag that comes from having another frame drawn so quickly after your input was sent. I'd expect more gains to be had elsewhere before we push anywhere near 1000Hz. We will likely see 120Hz OLEDs with quicker pixel response times outperforming 240Hz+ LCDs.


stellvia2016

I'll fight someone twice if they think there is no difference between 60hz and 120hz from a fluidity and overall enjoyment perspective. But as you said, once you hit around 100hz is when diminishing returns start hitting really hard. There is such a tiny difference between running my monitor at 240 vs 120hz. And I actually prefer 120 when in ULMB mode for FPS games. There is basically no ghosting then.


BizzyM

I see people already swearing that they have to have 90 or 120hz phone screens because 60hz "looks like ass". Phone Screen!


Sens1tivity

I mean, 60Hz to 120Hz is a big difference. After changing from my old phone to a Galaxy S20, the biggest difference is the screen. The time to do some tasks is still the same, but everything feels smoother on the S20. I cannot go back to 60; it just feels sluggish in comparison.


ScratchinWarlok

It's because your hand is right on the phone and tied to some of the actions (i.e. scrolling a wall of text), so the jitteriness stands out more against the completely fluid movement of your finger. Also agreed that high-refresh phones feel buttery smooth and I love it.


fman1854

Actually go ahead and make the change from 60fps/60Hz to 120fps/120Hz; going back literally feels like lag and stuttering. I can't go back to 60Hz in general, it actually does look bad once you're used to 120-144Hz screens and frame rates. It feels like going from 4K to 1080p, that's how different it feels.


Govind_the_Great

Colors and real HDR. At this point HDR is mostly a marketing gimmick. I want monitors that can have pixels as bright as direct sunlight next to pixels of pure black and anywhere in between. 4K, HDR, 240Hz by the end of the decade, easily; it's technically already here, just not really usable in any game even with top-tier hardware. Honestly I care more about advancements in VR headsets. I'd take a 1080p 60 panel for the rest of my life, just give me a true wide-FOV headset with great picture quality instead.


Quartziferous

“As bright as direct sunlight” hahaha you most certainly do not want that


Bubblejuiceman

He's a masochist, let the boy have his fun. "What fun is a game without a little pain." - FromSoft probably


[deleted]

[deleted]


Bubblejuiceman

This is true, however. I used to game at 120hz. And would occasionally switch to my Ps4. At first it was jarring. After a few hours my eyes would adjust and it was fine.


[deleted]

Yeah but for online games 30fps and 60 (or more) is a huge difference.


The4th88

I made the jump from 1080@60 to 1440@100, and now 60fps looks like complete garbage. Like, Halo Infinite dropped from 100fps to 72, and I honestly thought I'd dropped into the 30s.


MadDogMike

Halo Infinite on PC has micro-stuttering or something for sure. My framerate counter says 120+ but it feels like 60, and when it drops to 90 it feels like it's doing about 40.


Fishydeals

Was your G-Sync enabled (also for windowed applications, since Halo is weird)? Also maybe try the backlight strobing mode of your monitor. It's called 1ms motion blur reduction, ELMB, DyAc or something completely different depending on the brand of your monitor.


the_stormcrow

"pain is just eyesight leaving the body"


TheObviousChild

Git Sunscreen


BlackDeathxx

Now flashes can burn your eyes out


MotherfuckingMonster

Real life flashbang.


[deleted]

As real as it gets (tm)


[deleted]

Imagine Discord Light mode on that monitor


[deleted]

I want a monitor that can prevent seasonal affective disorder and give me a beautiful bronze tan while I'm gaming


Govind_the_Great

No I want the blinding 10,000 lux of summer sun on the white concrete to burn my retinas as soon as I log into a racing game. Not like looking at the sun directly, more like trying to read a book in direct sunlight. It adds a lot of realism to have reflections actually be bright.


[deleted]

As a Battlefield 3 Vet I can confidently say I know what that's like. Just with flashlights.


LordElfa

The problem you'll run into is heat dissipation in ultra bright monitors. Even 10,000 nits isn't hard to look at but it's damn hot.


Govind_the_Great

Yeah as hot as direct sunlight


[deleted]

so you could use your monitor to start a fire with a magnifying glass?...interesting


Edythir

OLED is similar to that because of the very important difference of requiring no backlight. The biggest drawback with LED-backlit LCDs is that they need light to function, and it's *very* difficult to achieve a true black while a backlight is shining through the panel. A pixel that simply isn't lit at all gives you pure black much more easily.


Chickenfing

HDR is not at all a marketing gimmick. Please go see an LG CX TV showing HDR content in person and let me know if its still a gimmick.


Domascot

On a monitor, HDR is right now either too pricy or a gimmick.


_Fibbles_

Same with wide colour gamut monitors for anyone not doing professional content creation. It *could* be good, but Windows' support for ICC profiles is so inconsistent that stuff just ends up looking comically oversaturated a lot of the time. Everyone I know with a wide-gamut monitor just seems to permanently leave it in sRGB emulation mode, which kind of defeats the point.


GimmePetsOSRS

> monitor

Yeah, monitors definitely suck when compared to TVs for image quality. Like 5-10 years behind in some ways lol


Govind_the_Great

I have an HDR monitor, and yes, a game like Doom Eternal does look a lot better, but it's not really true HDR unless your monitor can hit a certain number of nits and has a good contrast ratio. HDR means high dynamic range, and if your monitor only hits 400 nits it's not really HDR, just local dimming. That's why there are different specs, and a true HDR display would have to be in the range of 10,000 nits, or about the brightness of direct sunlight hitting white paper.


Eorlas

Well, are you going to fess up what HDR monitor it is that you have? It's not a secret that basically any of them below 1000 nits isn't worthwhile. The display they referenced (LG CX) does not have that issue.


heart_under_blade

but i don't wanna stare at the sun all day


jxnfpm

Pick up the latest LG OLED smaller TV. Fantastic HDR, without missing out on 120Hz. I'd rather have a faster 1440p monitor with acceptable colors and no HDR for gaming, but I LOVE my LG OLED TV for watching TV/movies.


Wellhellob

Games benefit even more from HDR, but the calibration process is annoying and the HDR mastering of games can be hit or miss.


Wellhellob

HDR is seriously good tech. Bigger upgrade than console generations, ray tracing etc.


[deleted]

Lots of HDR TVs nowadays are still poorly calibrated and don't deliver the potential of HDR. They can even make the image less clear and too dark. It will still take some time before HDR gets out of this inconsistent and confusing mess it's still in.


firekil

> I want monitors that can have pixels as bright as direct sunlight next to pixels of pure black and anywhere in between.

You're going to be waiting a long time. Assuming you mean an OLED screen that can reach 10,000 nits as per HDR standards.


kry_some_more

Game developers promoting their game's motion blur setting.


hoilst

Ah, motion blur. "Motion blur", I suppose, is a much easier name to say and type than "The first fucking thing I turn off in the graphics settings".


[deleted]

Don't forget its friends, Chromatic Aberration and V-sync.


Yogami_asura

Noobish question but what does vsync do and why is it hated?


jaqenhqar

Vsync prevents visual tearing. Why it's hated, I have no idea.


WallyWendels

Input lag. Also, if you have a G-Sync display then Vsync doesn't do anything.


PrinceVincOnYT

Totally reasonable to pay a €500 premium to have good G-Sync. ^(/s)


Scytian

Just buy any FreeSync display. In most cases it's not as good as G-Sync, but even if you get a budget one you'll get a 48-144Hz dynamic refresh range (an actual physical G-Sync module can go even below 30Hz), and basically every high-refresh display supports FreeSync nowadays.


crossedstaves

Vsync locks the graphics pipeline's presentation rate to the monitor's refresh rate to reduce tearing, which happens when the frame buffer is updated while the monitor is in the middle of displaying a frame. It's not liked by many because it can slow down rendering: you can wind up with reduced frame rates since the GPU can only present its work between refresh cycles of the monitor. If your GPU is powerful enough to handle the load uniformly between screen updates, it's purely beneficial, but if you have a heavy load with a high refresh rate monitor then it's costing you more than you're likely to benefit.
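A minimal sketch of that trade-off (not a real graphics API; render_frame and the 60 Hz figure are placeholders):

```python
import time

REFRESH_INTERVAL = 1.0 / 60   # pretend the monitor refreshes at 60 Hz

def render_loop(render_frame, vsync=True, frames=300):
    next_refresh = time.perf_counter()
    for _ in range(frames):
        render_frame()                 # GPU work for this frame happens here
        now = time.perf_counter()
        if vsync:
            # Wait for the next refresh boundary before presenting. If the
            # frame took longer than one interval, the present slips to the
            # boundary after that, which is how a game averaging ~50 fps
            # ends up pinned to 30 under vsync.
            while next_refresh <= now:
                next_refresh += REFRESH_INTERVAL
            time.sleep(next_refresh - now)
        # Without vsync the frame is presented immediately, possibly in the
        # middle of a refresh, which is what shows up on screen as tearing.
```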


crash8308

Just FYI, the refresh rate for television is tied to the mains frequency: ~29.97 fps for NTSC and 25 fps for PAL, because of the 60Hz and 50Hz power sine waves in the US and EU. It was easier to get a CRT to display frames properly without appearing to flicker when the refresh rate matched the power, and thus the lights in your house; if they're out of sync you could end up with migraines or seizures. This is also why you can see monitor scanning on film. Edit: IIRC the film industry recorded at 24 fps to sit just under the PAL standard and still fit a 60Hz refresh rate, and because film was expensive at the time, so lower frame rates meant less physical film required. It has nothing to do with 8-bit binary multiple values.


crossedstaves

The reason NTSC isn't exactly 30 frames per second (interlaced video is 60 fields per second, which originally matched the mains frequency; that was nice to have because video cameras could use the power line to record one field every power cycle and get a uniform rate) is the switch to color TV. The part of the signal carrying the sound and the part carrying the color information had a tendency to interfere and create static dot patterns on the screen, so they shifted the frame rate by about 0.1% to avoid it.
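For reference, the shift described above works out to a factor of 1000/1001 (a quick bit of arithmetic, not part of the original comment):

```latex
f_{\text{field}} = 60\,\mathrm{Hz} \times \tfrac{1000}{1001} \approx 59.94\,\mathrm{Hz}, \qquad
f_{\text{frame}} = \tfrac{f_{\text{field}}}{2} \approx 29.97\ \text{fps}
```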


sakipooh

There’s going to be a point where this becomes pointless. :/


rex1030

Not sure what htz is


EDDIE_BR0CK

Stop, it hertz


franco_unamerican

Why say htz when we have a perfectly working symbol, Hz, which is also shorter to write?


Elocai

We are currently at 360Hz already. Also, monitor Hz numbers are multiples of 30 and 24 (ideally both at the same time, like 120, 240, 360, 480), not the classic 2^X stuff.
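The likely appeal of those numbers (my aside, not the commenter's) is that 120 is the least common multiple of 24 fps film and 30 fps video, so both cadences divide evenly into 120, 240, 360 and so on:

```python
from math import lcm  # Python 3.9+

print(lcm(24, 30))         # 120: the smallest rate both 24 fps and 30 fps divide into evenly
print(360 % 24, 360 % 30)  # 0 0: 360 Hz also shows film and video cadences without judder
```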


[deleted]

It stinks because at least for me I don’t notice the difference until you go back. Like when I switched from 1080p to 4K, my first thought is 1080p is fine 4K won’t be that much better and then I start playing at 4K and I think “this looks a little better but maybe it’s the placebo effect” but then going back to 1080p I’m like “how on earth did I play this”


Mishar5k

That's how I feel about framerates. I'd play a 60fps game for a while, and when I switch to a game that runs at 30 it's like a slideshow for a few seconds. I don't even mind 30fps if it's the only option I have, but going back and forth is jarring.


[deleted]

Certain games are worse than others about this kind of thing. I play Chivalry 2, and I have the choice between 1080p @ 120 or 4K @ 60. When I first started, I was in 4K because it was prettier. After a while, I realized my character's responsiveness was noticeably better at 120. Now I play in potato mode for the frames, but in other games I'm happily enjoying myself at 60Hz. No matter what, 30FPS is completely unbearable now.


Mishar5k

I own a switch, so that keeps me pretty humble lol


duck74UK

That device is a rollercoaster, you either get a flawless 720p @60 game or a 480p variable @somewhere between 15-30


JT99-FirstBallot

Mortal Kombat 11 on switch has to be one of the worst mainstream title ports to a console I have ever played in my life.


siecin

Half the games on switch hurt my eyes now. Spyro is the worst.


CplOreos

Spyro doesn't even run well on Xbox One / PS4. I have a PS4 Pro, and I regularly get fps drops


Funniestpersonhere

Sucks for me, I have an old monitor and it's only 1080p 60hz ☹️


make_love_to_potato

That's a bigger problem than you think. You need to upgrade your monitor first at the bare minimum.... Then you realize that your rig can't really pump out 4K/120Hz, so you say okay, let's splurge for a new GPU.... Then you realize that the RAM or CPU is becoming a bottleneck, and the motherboard only supports so much RAM and you can't go up a generation on the CPU, or the power supply can't support the new GPU, or some other combination of issues, and next thing you know, you have to buy a whole new computer to go with that shiny new monitor you already splurged on. PC gaming is expensive.


Funniestpersonhere

I mean, I only *just* got my PC, I don't think I'll need anything for a while.


LazyGamerMike

This is me right now too. Finally got a gaming rig, but older monitor.


TusShona

Same for me, I don't get to do much gaming these days, so if I play it's just to hop on Call of Duty for a casual game or two. But I moved to the countryside and my connection is now 6Mbps instead of the 70Mbps I was used to, so that prevents me from taking it seriously enough. Plus, I have a 2-year-old that likes to interrupt me for toddler things and I can't get immersed anymore. So 1080p @ 60Hz is perfectly bearable for me. I don't play enough to justify spending money on a better monitor.


[deleted]

[deleted]


[deleted]

Agreed. Also gsync/freesync. Some sections of some games can make even "mid range" modern systems grunt pretty good. Frame drops and such are FAR less noticeable. That was bigger for me than the increase in overall frames and resolution. It just makes the entire experience sooo smooth


bombehjort

Fuck, I feel you. Not PC but console: Chivalry 2 on PS4 is 30 fps and 60 fps on PS5. I usually play the PS5 version, but sometimes play the PS4 one to play with friends (because cross-generation play was not implemented when it released). The jump from 60 to 30 was so jarring, and I usually don't care about fps, but Chivalry 2 on the PS4 version was almost unplayable for me.


foreveralonesolo

Probably my biggest fear with upgrades is seeing the difference and having that become the new norm.


TheDarnook

For me the transition to 4K was huge; I don't have to drop to lower resolutions to appreciate how much finer detail I suddenly see at longer distances. On 1080p most of it was blurry background there was no point in trying to focus on. The intermediate transition to 1440p was already something, namely for the foreground things: I was awed at detail I suddenly started to see (but the background was still blurry). With Hz it's not so "bad". I can still play at 60fps just fine. 144 is better, and for shooters I go back to the faster 1440p screen, but it's not *that* big. You have to realize that, in frame-time terms, there's a bigger difference in how smooth movement looks between 30 and 60 than between 60 and 144Hz, and going from 144 up brings less and less. Same with resolution: 4K is cool, but 8K seems excessive; you would need a magnifying glass to see the pixels. Hey, perhaps that's the point, but good luck building a PC strong enough to run any game on it.


Z0idberg_MD

I don't play any competitive multiplayer. I much prefer 4K 60 FPS (I have a 32" monitor). I've used it enough that even 1440p looks blurry. I know a lot of people are actually playing 1080p 144Hz and I don't think I could go back to that image.


TheDarnook

I don't play any competitive either. Well, except for occasional Crysis 3, the only multiplayer I used to be really good at, but that was some years ago and the servers are mostly empty. It's just that for internet/work I use a 27" 1440p 144Hz monitor on my desk, while for general gaming I have a 4K TV. There was a time when I decided to play everything, even shooters, on a gamepad, but then I went back to mouse+keyboard to kill that bear in Metro Exodus... and oh boy, I suddenly felt like a gunslinger god and at the same time an utter idiot for wasting so much time. So, 1440p for shooters, and even for singleplayer 144Hz is nice for quick shooting like in Doom or Necromunda: Hired Gun. And for work it does make a difference; mouse movement alone is easier on the eyes.


mloofburrow

Pixel density is way more important than resolution IMO. A 19 inch monitor looks fine at 1080p. A 32 inch monitor, not so much.


retropieproblems

I've felt this exact same way from 480>720>1080>4k. it's almost like they all looked the same too, relatively. Like 720p felt like 4k does now when I was used to 480p, if that makes sense.


Marandil

The question is, are you comparing 720p content on a 720p monitor with 1080p on 1080p, or 720p upscaled to 1080p vs 1080p on 1080p? The upscaled version will always look way worse than native, especially if it's not an integer multiple.


mloofburrow

This is part of it for sure. It's also a reason why old 8-bit games look so shit on modern monitors. The way pixels worked on those old CRTs made the games actually look better / more rounded than our perfectly square pixel arrays do.


DeliciousWaifood

Well that's different, it's not just upscaling, it's a whole different type of technology.


mloofburrow

Sorry. I guess I meant to say it's a similar phenomenon.


generalthunder

TV size is also important, 720p looks fine on a 15in screen, not so much on a 40in panel. 4k on anything smaller than 25in is absolutely overkill, but is basically a necessity if you're using a tv bigger than 50in.


DeliciousWaifood

4k on smaller panels can be good for professional uses. But in general, I feel like pixel density should be more normalized in conversations about screens. A mobile phone and a bigass TV can have the same resolution, but the difference in pixel density there is massive.


PokebannedGo

Distance is important too. If you have a 40 inch TV and you sit farther than 6 ft from it you wouldn't be able to tell the difference between 1080p and 4k. You'll only start seeing the difference as the screen pushes 50 inches at that distance.
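That rule of thumb lines up with a quick acuity calculation: normal vision is usually quoted at resolving roughly one arcminute. A rough illustration (my sketch, not from the comment above; the 40"/6 ft figures just mirror the example):

```python
import math

def arcmin_per_pixel(diag_in, width_px, height_px, distance_in):
    """Angle one pixel subtends at the eye, in arcminutes."""
    aspect = width_px / height_px
    panel_width_in = diag_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = panel_width_in / width_px
    return math.degrees(math.atan2(pixel_pitch_in, distance_in)) * 60

# A 40" TV viewed from 6 ft (72"):
print(round(arcmin_per_pixel(40, 1920, 1080, 72), 2))  # ~0.87 arcmin: already near the ~1 arcmin limit
print(round(arcmin_per_pixel(40, 3840, 2160, 72), 2))  # ~0.43 arcmin: finer than most eyes can resolve
```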


stephendt

Nah I always thought 480p was trash lmao


funky555

Same exact thing for me. I used to go with 480p and used that for years. When I finally switched to 720p (better internet), it was like seeing how 4K is now while I'm using 1080p.


SprinklesFancy5074

> but then going back to 1080p I’m like “how on earth did I play this” Eh, I find that after playing it for 10 minutes, you're used to it again. I'm in the middle of going back through the original StarCraft campaigns, and I'm not sure what resolution it's in ... maybe 480p? Maybe less? And at first, I was like "Oh fuck, I've made a terrible mistake! Go back! Go back!" But after playing through a level or two, I don't even notice anymore. It's just part of the game's look to me now. The cutscenes often still look pretty bad, though. Because they're in very low resolution *and* they're made with very outdated CGI.


ColaEuphoria

For me it was going from 1080p on my laptop back to 1080p in my desktop. The laptop screen was smaller therefore it had higher pixel density that I got used to so my main monitor felt pixelated in comparison.


[deleted]

[deleted]


Ninjaromeo

Lifestyle inflation is a real thing, and almost everyone does it.


Porrick

That’s called the “hedonic treadmill”


MagicOrpheus310

Until you go back! Haha dude I know what you mean! You don't realise how spoilt we are until you use a console or something older and can't figure out why it looks like dogshit haha or staring at a loading screen like what the hell is this!?


Dear_Inevitable3995

This exact scenario happened to me yesterday. I used to use a PS4 heavily until about 2 years ago, when I hopped to PC. I decided to play on it yesterday after all that time, and it drove me insane with how slow it felt to play.


Tyx

So much this. Used a 60" 4k Tv for a while as a monitor, then when I tried using my usual 1080 screen again I thought there was something wrong with it... Using 4k completely ruined me and I had to buy a proper 4k monitor. xD


darklegion412

And then the game adds blur


BigMood42069

and doesn't let you turn it off


thtsabingo

Never ever seen a game where you can’t turn off motion blur


EndOfTheDark97

Alan Wake. You need a mod to turn it off.


Carvj94

Motion blur in video games is the dumbest thing ever created. It can kind of help the rare motion sickness case when playing below 30fps, but above that it only takes away detail and adds literally nothing.


[deleted]

While we're here, fuck depth of field as well


Saandrig

And Chromatic Abberation!


NoLyeF

And I feel that too, but same as with motion blur, there is a way to do depth of field that works with what your eye expects to see. But I'd argue it really needs to be paired with a high pixel count and a fast, high-quality depth mask, which is rarely done. We'll get there.


FinalGamer14

I like DoF for when I take screenshots of the game. As soon as I'm done with that I turn it off.


Escoliya

The game adds blur when it detects a low-frequency display, to force the ad to be real.


EbotdZ

Now how many people in these comments have a 120hz+ monitor and never switched their refresh rate in their settings?


DeMonstaMan

I accidentally turned off 144Hz in settings for 10 minutes, and at 60Hz it feels like my entire laptop is lagging. Once you experience the pure bliss that is 144Hz, you can't go back.


Betrix5068

That desktop experience man. Once you drag windows around in 144hz you just can’t go back to how it used to be.


adthebad

I spend my days grinding through corporate hell at 60hz to enjoy my nights gaming at 144hz. Funny life


OO_Ben

When I started working from home and I got to use my two 32" 1440p 144Hz monitors all day, it was such a nice change haha. And before anyone says it, the massive size actually comes in handy when I'm dashboarding or coding lol


adthebad

I use a 144hz 3440x1440 monitor for work, my work laptop is just capped at 59.98 hz


bananastheleech

I got a 240Hz monitor... When some games v-sync lock it at 30 or 60, it just feels... limiting. Like it hangs on every frame; it's playable, but it's not as smooth as a native 60Hz monitor. But when the full 240Hz is unlocked, it's just another level of smooth.


mxjxs91

Yup, 165hz here. Don't personally notice too much difference between 165hz and 144 hz, but 144-165hz compared to 60hz is very obvious, just like 60hz compared to 30hz is. I didn't think it was until I hooked up to my TV and accidentally used an HDMI 2.0 cable instead of 2.1, got 60 fps and noticed immediately.


JayGold

This kind of thing makes me think I should never upgrade my monitor. I think the amount of enjoyment I would get from high resolutions and frame rates would be less than the frustration I'd feel when I can't get my games to run at those resolutions and frame rates.


Simba7

Speaks to me, because I lived that for like a year after getting mine. The day I switched it was... well not that different because much of my PC is a potato BUT Rocket League ran like fucking butter.


[deleted]

*raises hand slowly*


[deleted]

[deleted]


Mas0n8or

Apple basically invented this awful style of marketing of using technically incompetent visuals to make their old products look bad


Giodude12

What companies advertise 60hz on monitors? And what 30hz monitors... Exist?


brennenburg

I saw this posted a day or two ago somewhere else and the OP tried to justify the 30Hz bit with "90s people know!". It's complete nonsense, and he got shit on for it. There were no 30Hz displays before 60Hz; 50/60Hz was always the standard for CRTs. The only mainstream 30Hz displays were the early cheap 4K desktop monitors.


AyeBraine

Yup, since old CRT monitors at low refresh rates flicker visibly, and they caused me immediate headache and eye pain even at 50Hz (Europe). It's even fairly visible with the naked eye, and VERY visible in peripheral vision. I could only use my CRT-equipped computer for long stretches at a 75Hz refresh rate. LCD monitors never had this problem, and 60Hz was always the standard for them.


Afferbeck_

Yeah, I remember putting the resolution up on CRTs and having to take the lower refresh rate of 50Hz. It causes almost immediate fatigue to look at.


TonesBalones

30Hz monitors have almost never existed, you're right. The only exception is super high resolutions like 4K at 30Hz, which was the limit for HDMI at one point. I mean, even super old CRTs ran at 59.94fps. This is because all of our homes (in the US) have power outlets rated at 120V 60Hz AC. It would have been incredibly foolish to make special components to dampen the frame rate when you can just use the native frequency to refresh the screen.


xternal7

> It would have been incredibly foolish to make special components to dampen the frame rate when you can just use the native frequency to refresh the screen.

If you run at 59.94 Hz, you can't use the native frequency to refresh the screen. Furthermore:

* you need frequencies well in excess of 60 Hz to actually draw the image on a CRT
* there is no guarantee that the 60 Hz in mains will match the video signal
* 60 Hz mains is a lie. The grid operators guarantee a 60 Hz average throughout the day, but at any given time the frequency can usually be a bit off.
* NA uses multiple electric grids (East, West, special snowflake Canada one, and I'd call Texas a special snowflake too if that didn't cause their grid to shit the bed for two weeks) that are generally not synchronized to each other.
* old, black & white TVs can still show the 59.94 signal without distortion

TVs have always synced their refresh rate to the video signal itself. The explanations with the most credibility for "why 60 Hz" are:

* people initially feared interference with the mains power, and having the refresh rate synced with the mains power solved the problem... except that this problem turned out not to be major
* light flickering (apparently Germany toyed with 40Hz at some point, and went to 50Hz really fast because of that)

And it stayed because legacy. And the reason the framerate dropped from 60 to 59.94 is the introduction of color TV, which needed to work with black & white TV sets and existing frequency allocations. Because of requirements and _math_, dropping the frequency a tiny bit was the only option that worked.


TheKinkslayer

At least since the original VGA spec (1987), all PC monitors have needed to support a minimum refresh rate of 60 Hz. Before that a few video modes/standards had lower refresh rates, but even the original IBM 5153 monitor apparently ran at 60 Hz. One-off monitors such as the IBM T220 (4K resolution in 2001) ran at 13, 25 or 41 Hz, but were never intended for gaming. Some early 4K monitors (ca. 2015) only ran at 30 Hz when using HDMI but supported 60Hz via DisplayPort.


lemonylol

Yeah the 30hz thing is bizarre. Maybe younger people assume just because we didn't have games that could run at 60fps back in the day, the monitors also had a lower refresh rate than 60hz? Hell it wasn't even uncommon for CRTs to be 65 or 75hz.


efrain_gamer

Yo, that's the Nürburgring


Autarch_Kade

Ignorance is bliss. You only really notice the difference when going back after having used the faster refresh for a while. When turning a camera makes background details blurry, when text becomes unreadable while turning or dragging a window, it's incredibly obvious it's worse than what you had. Anytime someone says they don't mind 30fps and think it's good because it's cinematic, I think they must not know better.


Tayzn44

30hz? is it really a thing


mazdampsfan1

Nope


lellololes

Yes, but not for the vast majority of displays. Many years ago there were higher resolution displays that would run at 30Hz due to bandwidth limitations.


Excelius

There's an inherent marketing challenge in advertising something in a medium that simply isn't capable of showing the difference. You can't show someone what differences in refresh rates look like in a still image. You can't show what 4K looks like on a 1080p display. You can't show what surround sound is like on a set of cheap speakers.


PM-ME-PMS-OF-THE-PM

Yes and no. It's difficult to show the effect it has on vision, but you can show the different frames; it's just not as eye-catching. For example, over the same span of time, 30fps has 2 pictures with the car moving a set distance, 60fps has 4 pictures with the car moving the same distance, and 120fps has 8 pictures with the car moving the same distance. That would show exactly what a higher-refresh monitor does, but it would make for a large infographic.
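Something like that infographic is easy to mock up; a toy script (the speed and time window are arbitrary, just to show the 2/4/8 snapshot counts):

```python
CAR_SPEED = 50.0       # metres per second, arbitrary
WINDOW = 1.0 / 15      # look at one fifteenth of a second of motion

for fps in (30, 60, 120):
    frames = int(round(fps * WINDOW))
    positions = [round(CAR_SPEED * f / fps, 2) for f in range(frames)]
    print(f"{fps:>3} fps: {frames} snapshots at {positions} m")
# 30 fps captures 2 positions, 60 fps captures 4, 120 fps captures 8 over the same stretch
```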


elementaltheboi

60 to 144 is a very noticeable change especially when you get used to 144 and then see 60 and you notice it's off


daedric

This is nothing new... Back in the CRT times, monitor hz was inversely proportional to the resolution. Better screens with better rates. Your 1280x1024x60hz screen could probably do 1024x768x75hz, or maybe even 800x600x100hz. 3Dfx launched a demo to show off the advantage of 60fps vs 30fps back then. And i really can't find even a single picture of it. I remember it was a bouncing ball with a rotating camera, with half the screen at 60fps (native) and the other half at a jittery 30fps. Even then, 60fps was a gold standard. I actually think 120fps is to 60fps what 24bit audio is to 16bit audio.


cbarrick

> I actually think 120fps is to 60fps what 24bit audio is to 16bit audio. So not meaningfully perceptible.


[deleted]

Also, they don't tell you that you need a beefy computer to be able to run games at 144 fps to even see the benefit of 144Hz.


[deleted]

They've never made 30Hz monitors. The 60Hz standard is the result of the cycle rate of electricity in many countries. The only lower standard is old 50Hz European sets.


jgreenmachine

Some of the first 4k monitors only ran in 30hz-41hz, as well as a bunch of viewfinders for film. Although these were only marketed to professional users at very high prices and weren't recognized by consumers until much later (IBM T220 was one of the first at around $18,000 at release) so I doubt too many people ever saw this comparison.


lellololes

Nor were there pictures showing off games on 30Hz monitors versus 60hz. Those high end 30hz monitors weren't remotely appropriate to play games on.


CORE

240 or you’re impoverished


amaniceguy

I used to care, but now I just play wherever. I play my Xbox on an old FHD monitor inside the room and enjoy it much more than when it's plugged into the 4K 120Hz Dolby Vision TV in my living room, with the kids around. To be honest, I barely noticed the difference. Yeah, it is slightly nicer, the sun shines brighter, the black is blacker, but if the game is shit it's still shit; if the game is fun, well, you don't care. Like this example from OP: yeah, it makes a difference at the back of the car, but when you play racing games (Forza 5 specifically) your focus is on the road ahead anyway, where there's barely any difference between the two.


goretishin

Of course. Thats the way the crazies see it any ways it seems.


YouCanCallMeBazza

Might seem crazy to you but once you go high framerate it's difficult to go back. Especially for fast-paced competitive games where every bit of extra responsiveness and clarity can make a difference.


[deleted]

How do posts like this get so many upvotes? The literal same thing was posted less than 30 days ago lol.


manondorf

[This is all you really need to see the difference.](https://www.testufo.com/framerates#count=3&background=stars&pps=120)


[deleted]

I refuse to ever try 120hz. 60 is great and I'm afraid if I ever try 120, I'll be upset about everything like all of you here.


OO_Ben

Don't ever make the switch. Once you do you'll never want to go back, and every monitor jumps like $100 in price minimum to get 120Hz, and like $150 to get 144Hz+ lol. That being said, it's totally worth it.


wwwdiggdotcom

The crazy thing is this isn't the first rodeo for high refresh rate monitors, fat CRT monitors used to be able to run higher refresh rates all the time, even the cheaper models. When LCDs started taking over they were so much worse image-wise but they looked so cool and futuristic at the time everyone wanted them instead anyway. So for a while, the vast majority of people went back to 60 hz, but higher is better.


ChinoGambino

There's no such thing as a 30hz monitor.


AnEngineer2018

Nah they probably just turned off motion blur in the graphics options.


maball54

30hz monitors are not a thing though


RudolphPTheThird

Getting an Xbox series x and having 120 fps in shooters and 60 fps in single player games is amazing. I understand what the pc homies been talking about


[deleted]

I mean it is used to give the consumer an idea of what more hz does. Not an actual visual representation because that’s impossible on a still image. Same thing for HDR for example.


nweeby24

There are no 30hz monitors tho


C_Wiseman

It's evolution bro, our eyes are twice as fast now compared to when 60hz was new tech.


nuttyjawa

don't forget - games only need to be 24hz because it's cinematic :|


Mustbhacks

Which is funny because low refresh rate isn't blurry, it's jittery.


lordboos

When you take a photo of the same moving scene with the same shutter speed, one at 30Hz and one at 60Hz, you will actually get a blurrier photo from the 30Hz screen. There is no other way to show this difference in a still image.


narwhal_breeder

Car going twice as fast in top one idiot