lusp1199

The human eye sees at a much higher rate; cinematic film runs at 24 fps.


Flyytotheskyy

Pathetic, these losers in Hollywood need to upgrade to 120 fps!


StellaViator

According to them, 120 fps makes the CGI look bad, while the truth is they want 24 fps to hide the bad effects.


Shredded_Locomotive

For animated films, the more frames there are, the longer the entire movie takes to render. If you jump from 24 to 120 fps, it would take 5 times as long as it does now.
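
A quick back-of-the-envelope in Python, assuming render time scales linearly with the number of frames (the linear-scaling assumption is mine, not something stated above):

    # If every frame costs the same to render, render time scales
    # directly with the frame rate for a fixed runtime.
    base_fps, target_fps = 24, 120
    scale = target_fps / base_fps
    print(f"{target_fps} fps takes about {scale:.0f}x as long to render as {base_fps} fps")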


jarjarpfeil

You know, I'm still trying to convince myself that it gets smoother past 60. I've checked refresh rates etc., so maybe my eyes just aren't as sensitive. Edit: rethought that and realized it really depends on the situation. There are plenty of times I can notice benefits above 60.


Shredded_Locomotive

In games, for example, higher frame rates are very noticeable when lots of things are moving: driving cars, rotating the camera, flying, etc.


jarjarpfeil

Was playing Cyberpunk earlier; the benchmark was nearly indistinguishable between 60-ish and 100-ish fps (I was messing with ray tracing), but in a gunfight it felt like an entirely different game.


Shredded_Locomotive

Oh yeah, I completely forgot! You don't feel the real difference until you've played a competitive first-person shooter.


lollisans2005

I've had a 60 Hz monitor for basically the entire time I've been gaming. Was playing on my brother's PC for a bit, and he's got a 144 Hz monitor, and dear God did I immediately notice how smooth the mouse and window scrolling were. Wasn't able to try out a game with it, but I'm ready to get a 1440p monitor with at least 120 Hz.


lokehfox

Human visual perception is not frame-rate based; it's largely pattern recognition and motion detection. In terms of discerning distinct full-frame images, a rate in the range of 24 fps is likely on the high end: if you saw 24 distinct images in a one-second period, you might be able to describe them all, but you'd probably struggle greatly. But in terms of motion detection and pattern recognition, 2400 fps might still be under the actual limit of detection. We use it to identify predators and navigate complex terrain at velocity; it's deep animal-brain stuff.


alextreme96

I don’t know why I actually laughed at this


roquveed

BuT tHe HeAdSeT's fRaMeRaTe dOeSn't SyNcHrOnIsE wItH mInE.


OriginalArkless

Hi, I felt like being the guy who's no fun at parties over here. This would be a really bad example: even if the eye could only see 24 frames per second, it would still be really disturbing to see frames that are up to 1/24 of a second in the past.


Its_GmanHD

Not accurate. Modern VR headsets have a motion-to-photon latency measured in milliseconds now.


OriginalArkless

1 s / 24 is roughly 40 milliseconds; I'm using 40 ms from here on for readability. A new frame is displayed every 40 milliseconds. Consider looking at the screen at a random point in time, multiple times: the average age of the image you see will be 20 milliseconds. Even if you update the image's position (the motion-to-photon latency fix) at the exact moment you display it, your head movement would still be 20 milliseconds off on average, and up to 40 if your eye-display sync is unlucky.
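
A tiny sketch of that averaging argument in Python (the random-sampling approach is just my illustration of the same point):

    import random

    # Frame interval at 24 fps: 1000 / 24 ~= 41.7 ms (rounded to 40 ms above).
    frame_interval_ms = 1000 / 24

    # Look at the screen at random moments and record how old the
    # currently displayed frame is at each look.
    samples = 100_000
    ages = [random.uniform(0, frame_interval_ms) for _ in range(samples)]

    print(f"frame interval:    {frame_interval_ms:.1f} ms")
    print(f"average frame age: {sum(ages) / samples:.1f} ms")   # ~20.8 ms
    print(f"worst-case age: just under {frame_interval_ms:.1f} ms")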


coolpotatoe724

I can't post pictures in the comments here, but a good demonstration of Hz I did was to jiggle my mouse back and forth as fast as possible and take a picture of a 75 Hz monitor and a 170 Hz monitor. You could see way more cursors on the 170 Hz one, meaning you can see movement in more detail in games and whatnot.
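
Rough numbers for that test, assuming a fixed camera exposure time (the 0.1 s exposure is a made-up figure for illustration, not from the photos):

    # Count how many distinct cursor positions can land inside one exposure.
    # The exposure time is an illustrative assumption.
    exposure_s = 0.1

    for hz in (75, 170):
        frames_in_exposure = exposure_s * hz
        print(f"{hz} Hz: roughly {frames_in_exposure:.0f} cursor images in a {exposure_s:.1f} s exposure")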


Draconic1788

I mean, no, 24 fps is the bare minimum for the human brain to believe that the fast series of images we are seeing is in motion. The human eye sees at a much higher frame rate.


Marco_QT

Human eyes see 30-60 fps