
Frame rates and human perception



[Moderator note: This discussion was originally split from another topic here, which was about KSP 2 and resource consumption on the PC. It resulted in a lengthy and interesting discussion that didn't really pertain to the original topic, so we've split it off into a thread of its own, here.]

On 7/23/2021 at 5:13 PM, Bej Kerman said:

60 fps+ will be supported better in the future, but it's not like human senses will be upgraded to notice the change. 60 fps+ will be overwhelmingly pointless for the time being.

Have you ever played at 144 fps? 95 fps? It's literally a night-and-day difference compared to 60 fps.

Our senses are a limit, but that limit isn't at 60 fps or even 240, lel.

Edited by Snark
Split by moderator from another thread

28 minutes ago, Incarnation of Chaos said:

Have you ever played at 144 fps? 95 fps? It's literally a night-and-day difference compared to 60 fps.

Our senses are a limit, but that limit isn't at 60 fps or even 240, lel.

In the 80s, Americans used to complain when watching TV in Europe: they were used to 60 Hz, while Europe used 50 Hz as the refresh rate, and the flickering was annoying.

But 60 Hz flickers just the same, only at a faster pace.

So what was the problem? Perception.

After some time, people got used to the frame rate and stopped perceiving it.

You see differences between 60 and 144 fps because you want to see such differences. Once you start to pay attention to the game itself, you stop perceiving them.

There's a reason most movies are still shot at 24 fps nowadays.

TL;DR: 60 or even 120 fps is technically better, but not necessarily desirable. Most people get more satisfaction from the current 24 fps, and that ends up settling the matter.

 

 


1 hour ago, Lisias said:

In the 80s, Americans used to complain when watching TV in Europe: they were used to 60 Hz, while Europe used 50 Hz as the refresh rate, and the flickering was annoying.

But 60 Hz flickers just the same, only at a faster pace.

So what was the problem? Perception.

After some time, people got used to the frame rate and stopped perceiving it.

You see differences between 60 and 144 fps because you want to see such differences. Once you start to pay attention to the game itself, you stop perceiving them.

There's a reason most movies are still shot at 24 fps nowadays.

TL;DR: 60 or even 120 fps is technically better, but not necessarily desirable. Most people get more satisfaction from the current 24 fps, and that ends up settling the matter.

Comparing movie frame rates and game frame rates is a bad idea. Frame rates in movies are largely a stylistic choice. When movies have a low frame rate, they show streaking, which helps us perceive intended fast motion, as in car chases. In video games you just have a crisp still frame sitting there for longer, which is why motion blur needs to be added in as an effect.

Also, you are comparing a difference between 50 and 60 Hz (a 20% rate increase), which was apparently noticeable, and using that experience to support assertions about the difference in comfort between 60 and 144 Hz (a 140% rate increase)...

Also, you aren't controlling the camera in a movie. It is not a "you chose to perceive it" type of deal; if you play a fast-paced game, then 24 Hz is flat-out unacceptable.

Edited by mcwaffles2003

21 minutes ago, mcwaffles2003 said:

@Incarnation of Chaos Why do you keep quoting me while leaving the quote blank?

Because mobile sucks

1 hour ago, Lisias said:

In the 80s, Americans used to complain when watching TV in Europe: they were used to 60 Hz, while Europe used 50 Hz as the refresh rate, and the flickering was annoying.

But 60 Hz flickers just the same, only at a faster pace.

So what was the problem? Perception.

After some time, people got used to the frame rate and stopped perceiving it.

You see differences between 60 and 144 fps because you want to see such differences. Once you start to pay attention to the game itself, you stop perceiving them.

There's a reason most movies are still shot at 24 fps nowadays.

TL;DR: 60 or even 120 fps is technically better, but not necessarily desirable. Most people get more satisfaction from the current 24 fps, and that ends up settling the matter.

 

 

Play a game at 24 fps, then at 60.

And no, I don't just see it because I want to. I have literally gone back to 30 and 60 fps, and the difference is incredible.

Everything is slower, animations chug, input lag is oppressive. Again, you can read about this all day, but until you actually play at higher refresh rates you won't understand what I'm saying in the slightest.

Also, TV refresh rates have far more to do with how much data you can push over a given frequency than anything else. 50 Hz is less information, so less wattage at the station for the same transmission.

Edited by Incarnation of Chaos

8 hours ago, mcwaffles2003 said:

Also, you aren't controlling the camera in a movie. It is not a "you chose to perceive it" type of deal; if you play a fast-paced game, then 24 Hz is flat-out unacceptable.

I didn't say 24 fps is acceptable for games; I said it's still preferable for movies. People's perception was the core of the argument.

 

7 hours ago, Incarnation of Chaos said:

Because mobile sucks

Agreed on all counts!

 

7 hours ago, Incarnation of Chaos said:

Everything is slower, animations chug, input lag is oppressive.

I think we are not on the same page anymore? Low fps doesn't make things slower; it makes them "chunkier".

An object that crosses the screen from one side to the other in 2 seconds will do it in 2 seconds no matter how many frames it takes: 50, 100, 120, or 288.

About the input: you don't need to attach the input to the frame rate. Doing input on the critical path of a time-critical computation is far from a good idea anyway. Every single academic text I've ever read about real-time computing advises against doing I/O in time-critical code.

And, one way or another, the real bottleneck for a smooth animation is the monitor's refresh rate. 4K monitors do 60 to 70 Hz, and most affordable 1080p ones like mine do 75 Hz. So stressing the GPU beyond that just doesn't improve the experience.

Not to mention studies of human vision stating that we can process images in as little as 13 ms (from 80 down to 13, as I read), so monitors faster than 75 Hz are just not helpful: 1000 / 75 ≈ 13.33 ms, the most common limit cited for the brain to process human vision.

I think that 75 Hz with vsync is probably the same experience as a 144 Hz image, but "cheaper".
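(A minimal sketch, not from the original post: the input/render decoupling described above, in Python with illustrative names. The input thread polls at ~1 kHz while the render loop ticks at a fixed 75 Hz, so input capture never waits on a frame.)

import threading, time

latest_input = {"x": 0.0}          # shared state, written by the input thread
stop = threading.Event()

def input_thread():
    # Poll input at ~1000 Hz, independently of the frame rate.
    while not stop.is_set():
        latest_input["x"] += 0.001  # stand-in for reading a mouse delta
        time.sleep(0.001)

def render_loop(refresh_hz=75, frames=75):
    frame_time = 1.0 / refresh_hz
    for _ in range(frames):
        snapshot = latest_input["x"]  # render whatever input state exists right now
        # ... draw the frame using `snapshot` ...
        time.sleep(frame_time)        # stand-in for waiting on vsync

t = threading.Thread(target=input_thread, daemon=True)
t.start()
render_loop()
stop.set()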

Edited by Lisias
Tyops! surprised?

6 hours ago, Lisias said:

I didn't say 24 fps is acceptable for games; I said it's still preferable for movies. People's perception was the core of the argument.

Nonetheless, the argument still fails, since having a low frame rate changes how each frame looks in a movie, but in video games it simply makes your motion feel sluggish. Now, if you are playing a slow-paced or top-down game like KSP, RimWorld, Cities: Skylines, etc., it can be argued that 60 Hz is enough (it is for me; these are the games I typically play, and I use a 60 Hz TV for a monitor). That said, in the case of a game where you are reflexively moving the mouse and/or are in a first-person perspective, like any competitive FPS, fighting game, or even a casual first-person RPG, high refresh rates beyond 60 Hz make significant differences, not just in appearance, as you seem to allude to, but in feel. Lower frame rates will make you genuinely feel sluggish, full stop. There is a reason VR headsets with frame rates as low as 60 Hz or lower are known to commonly cause severe motion sickness.

Also...

6 hours ago, Lisias said:

1000 / 75 ≈ 13.33 ms, the most common limit cited for the brain to process human vision

Our brains aren't digital computers, and we don't have simple refresh rates.

6 hours ago, Lisias said:

I think that 75 Hz with vsync is probably the same experience as a 144 Hz image, but "cheaper".

It's not.


5 hours ago, mcwaffles2003 said:

Nonetheless, the argument still fails, since having a low frame rate changes how each frame looks in a movie

Nope. The argument is exactly about how low frame rates look in movies, and yet people still prefer them, for reasons out of the scope of this discussion.

Being "technically better" doesn't automatically improve the experience.

 

5 hours ago, mcwaffles2003 said:

but in video games it simply makes your motion feel sluggish. Now, if you are playing a slow-paced or top-down game like KSP, RimWorld, Cities: Skylines, etc., it can be argued that 60 Hz is enough (it is for me; these are the games I typically play, and I use a 60 Hz TV for a monitor).

Nope. Your brain takes at least 13 ms to process an image. Then it needs time to propagate the muscle commands from the brain, another 16 to 25 ms. So you have something like 26 to 51 ms between the stimulus and the reaction (something between roughly 40 and 20 Hz!!).

So, you see, we have a significant lag between what we see and what we do about what we see.

Considering that we perceive visual cues way faster than we react to them, 75 Hz appears to be the best compromise.

 

5 hours ago, mcwaffles2003 said:

That said, in the case of a game where you are reflexively moving the mouse and/or are in a first-person perspective, like any competitive FPS, fighting game, or even a casual first-person RPG, high refresh rates beyond 60 Hz make significant differences, not just in appearance, as you seem to allude to, but in feel.

Not exactly. What's happening here is that the input controls are (wrongly, in my opinion) tied to the frame rate.

Since we have a delay of about 26 to 51 ms between the perception of the stimulus and the reaction, and the stimulus itself takes about 13 ms to be processed, we have an additional 13 ms to confirm the results of that reaction.

Let's assume a very responsive young man whose reaction times are at the top of the species: 13 ms to process the visual stimulus, and 13 ms to propagate the synapses into muscle signal receptors. 26 ms total, or 2 frames at 75 Hz. You need an extra frame to process the results of what you did, so between the original stimulus and the processing of the feedback you "waste" 3 frames, and the cycle ends at the 4th frame (remember Nyquist: anything happening between two sampling points is detected at the second sampling point, unless its duration is less than the delta-T of the sampling points, in which case it is plain lost).

At a faster frame rate with the inputs attached to the frame, the computer will be able to show you the feedback earlier in that pipeline. Not because of the higher frame rate (because you don't see anything below 13 ms anyway), but because the processing of your reaction is on the critical path of the frame pipeline, which delays the resulting frame for you; so speeding up the whole pipeline ends up showing you the resulting frame sooner.

So, yeah, on a system where the input is attached to the frame, going 120 or 150 Hz will improve perception, because the resulting frame will be ready before you can perceive a frame. But this happens not because doubling the frame rate is perceptible to you; it happens because someone tied the input and its processing to the frame's critical path, delaying the results to you.

Had the computer been programmed to receive the input and process the reaction at its maximum speed, instead of being forced to pace down to the frame rate, you could have the feedback displayed at the start of the 3rd frame from the stimulus that triggered the chain.
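(To make the arithmetic above concrete, here is a back-of-the-envelope check in Python, using the post's own 13 ms figures; these are illustrative numbers, not measurements.)

import math

PERCEIVE_MS = 13.0   # the post's figure for processing a visual stimulus
REACT_MS    = 13.0   # the post's best-case figure for nerve/muscle propagation

for hz in (60, 75, 120, 144):
    frame_ms = 1000.0 / hz
    # frames elapsed from stimulus until the frame that can show the feedback
    frames = math.ceil((PERCEIVE_MS + REACT_MS) / frame_ms) + 1
    print(f"{hz:>3} Hz: frame = {frame_ms:5.2f} ms, "
          f"stimulus-to-feedback ≈ {frames} frames ({frames * frame_ms:5.1f} ms)")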

 

5 hours ago, mcwaffles2003 said:

but in feel.

You can't "feel" anything faster than 13 ms. Feel :P free to provide any material supporting your argument.

Here follows mine:

 

5 hours ago, mcwaffles2003 said:

There is a reason VR headsets with frame rates as low as 60 Hz or lower are known to commonly cause severe motion sickness.

I don't know how VR headsets work nowadays, but back when we were toying with them (we were using Nintendo Power Gloves and LCD shutter glasses, both tied to a parallel port), the problem was that each eye receives half the frame rate, as the computer needs to draw two frames instead of one, one for each eye. So the 60 Hz CRT monitors we used to have would render 30 Hz per eye, tops, and people using more expensive 90 Hz monitors reported better results for fast-paced animation, not to mention way less strain on the eyes.

(Anyone played Sega Master System with the 3D glasses? I did, lots of fun - but I could not play it for too long! :) )

One way or another, the sickness is reported to be related to the mismatch between the apparent movement shown in the headset and the lack of real movement reported by your body. This is disruptive for your brain.

 

5 hours ago, mcwaffles2003 said:

Our brains aren't digital computers, and we don't have simple refresh rates.

Our brains are biological/chemical computers, and the laws of physics still apply. We have limits on the sampling rates we use to perceive reality.

 

5 hours ago, mcwaffles2003 said:

It's not.

Sources, please.

Edited by Lisias
tyops as usulla...

3 hours ago, Lisias said:

I don't know how VR headsets work nowadays, but back when we were toying with them (we were using Nintendo Power Gloves and LCD shutter glasses, both tied to a parallel port), the problem was that each eye receives half the frame rate, as the computer needs to draw two frames instead of one, one for each eye. So the 60 Hz CRT monitors we used to have would render 30 Hz per eye, tops, and people using more expensive 90 Hz monitors reported better results for fast-paced animation, not to mention way less strain on the eyes.

Nope, modern headsets run the full frame rate on both eyes.

3 hours ago, Lisias said:

One way or another, the sickness is reported to be related to the mismatch between the apparent movement shown in the headset and the lack of real movement reported by your body. This is disruptive for your brain.

Motion sickness is much more subtle than that: it can sprout from any discrepancy between what you feel and what you see, and a lower frame rate is known to cause it. I'm lucky; I don't suffer from any form of motion sickness no matter how hard I've tried. In over 1500 hours of VR, I only felt a little nauseous once, in Elite Dangerous, while rolling down a hill in a Scarab at low gravity while spinning.

But I can totally notice if I forget to set the frame rate back to 120; the difference in comfort and eye strain is real.

What you fail to understand is that the human eye is not a digital camera, and it makes no sense to treat it like one.


7 hours ago, Master39 said:

What you fail to understand is that the human eye is not a digital camera, and it makes no sense to treat it like one.

Apparently my lack of understanding is pervasive in the scientific community. :)

Spoiler

Persistence of Vision is a thing. :)

On the other hand, I learned that the minimum FPS for preventing headaches (which can be severe to the point of causing nausea) is 90 Hz per eye.

It appears to be related more to eye strain and eye fatigue than to perception, though.

 

Edited by Lisias
MOAR INFO

12 hours ago, Lisias said:

Apparently my lack of understanding is pervasive in the scientific community. :)

Spoiler

Persistence of Vision is a thing. :)

On the other hand, I learned that the minimum FPS for preventing headaches (which can be severe to the point of causing nausea) is 90 Hz per eye.

It appears to be related more to eye strain and eye fatigue than to perception, though.

 

Neither of your sources says anything about frame rates, or the perception of higher refresh rates, or anything that bears on the argument at hand; and the fact that you need to stay above a specific frame rate to avoid adverse effects in a VR headset doesn't match your thesis that the human eye is unable to see anything above 60 Hz.

[snip]

Edited by Snark
Redacted by moderator

Standard 35mm film may be shot at 24 fps, but each frame was shown 2 (or, later, 3) times to give a sort-of 48 fps or 72 fps experience.

https://en.wikipedia.org/wiki/Frame_rate

How this is done now is a bit more complex.

https://en.wikipedia.org/wiki/Three-two_pull_down

Now that most cinema projection is digital, I imagine there have been and will be changes to drive this, from shooting to projecting, at a higher rate.
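(For illustration only: the 2:3 cadence described in the second link, sketched as a toy mapping in Python. Real telecine works on interlaced fields and is more involved.)

def two_three_pulldown(film_frames):
    # Alternate showing each film frame for 2 fields, then 3 fields:
    # 4 film frames -> 10 video fields, i.e. 24 fps film -> ~30 fps video.
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeats)
    return fields

print(two_three_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']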


5 hours ago, Jacke said:

Standard 35mm film may be shot at 24 fps, but each frame was shown 2 (or, later, 3) times to give a sort-of 48 fps or 72 fps experience.

Pretty sure this practice is done not to make the movie appear as if it were at a higher frame rate (flashing the same image 3 times doesn't make motion smoother), but to reduce flicker from the bright light turning on and off, which is very noticeable at 24 Hz. This was the next thing covered after the part you quoted in that wiki.

5 hours ago, Jacke said:

How this is done now is a bit more complex.

https://en.wikipedia.org/wiki/Three-two_pull_down

Pretty sure this is more about effectively transferring 24 Hz film to a 30 Hz (29.97 Hz) broadcast. Funny thing: back in the early years of television, they used to get film onto TV by pointing a broadcast camera at a screen playing the recording. Unless it was live, of course.

5 hours ago, Jacke said:

Now that most cinema projection is digital, I imagine there have been and will be changes to drive this, from shooting to projecting, at a higher rate.

A higher frame rate doesn't necessarily improve a movie; it just offers another stylistic choice.

Some artists are very opinionated about the topic. I found this video entertaining and educational; it's about a similar topic.

Spoiler

[embedded video]

 

[snip]

The article linked to me wasn't about perceiving smoothness, but about reacting to stimuli, which is more an argument about input lag than about refresh rates...

22 hours ago, Lisias said:

You can't "feel" anything faster than 13 ms. Feel :P free to provide any material supporting your argument.

Here follows mine:

 

Quote

The fastest rate at which humans appear to be able to process incoming visual stimuli is about 13 ms. Receiving a stream of data faster than this will only underscore the limits of our perception

"feel" != "react to"

From the linked article:

Quote

New studies show that humans can interpret visual cues seen for as little as 13 ms (about 1 in 75 frames per second).

Keyword: "interpret" as in your brain reacting to the visual feed it is receiving. Our ability to perceive constant smooth motion goes past this.

Edited by Snark
Redacted by moderator

5 hours ago, Master39 said:

Neither of your sources says anything about frame rates, or the perception of higher refresh rates, or anything that bears on the argument at hand; and the fact that you need to stay above a specific frame rate to avoid adverse effects in a VR headset doesn't match your thesis that the human eye is unable to see anything above 60 Hz.

Because I'm addressing your objections one at a time.

The last link was about how human eyes work, addressing your terrible claim that human eyes are some kind of special device completely unrelated to the laws of physics.

[snip]

 

 

5 hours ago, mcwaffles2003 said:

"feel" != "react to"

Exactly. 

 

5 hours ago, mcwaffles2003 said:

Keyword: "interpret" as in your brain reacting to the visual feed it is receiving. Our ability to perceive constant smooth motion goes past this.

Which hints that you don't need animations faster than 75 fps, the whole point of the discussion.

One just can't "perceive" anything faster than the brain can process it. Eyes are not autonomous devices; they need feedback from the brain in order to work - for example, to size the iris aperture to cope with the current illumination levels. Why do you think first responders shine a flashlight into the iris to check the brain's responses?

Not to mention that the eye's cones themselves (the rods are not useful for gaming, as their "resolution" is too low; see my previous video) also have their limitations. The cones have a saturation point where they don't work properly, and measuring these timings gets us some very interesting data:

Spoiler

[figure: photoreceptor response-time plot from the article below]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2270196/

(please note the source of the information)

The article I linked is terribly dense (hard as hell to extract layman-understandable data from; it took me some time to get it), but in essence what it is saying is that once the rods are "overexposed", they take about 13 ms (interesting number) to cease working, and then another 7 ms to get back to work.

Monitors (and VR headsets in particular) are nothing more than millions of small flashlights pointed at your eyes, by the way.

As if that weren't enough, I found an article where the rod and cone response times are measured directly (forget about the brain; now we are talking directly to the photoreceptors in the eyes).

These tests were made using saturation (and using photoreceptors from different animals, so they are not directly applicable to humans). But some numbers are consistent with the discussion: rods saturate after approximately 10 to 15 ms (cones saturate way faster), and once they are saturated, they cease to work as expected.

Unrelated to the discussion (we are talking about how fast the brain can perceive visual stimuli, not how slow the stimuli can be before annoyance), I found this interesting article - perhaps it could help explain why you "perceive" some things differently from what the science says:

 

Edited by Snark
Redacted by moderator

Some content has been redacted and/or removed, due to flaring tempers and people making it personal rather than having a civil debate on the merits of the arguments.

  • Please do not tell other people what to do or what not to do.
  • Please do not make personal remarks.  Address the post, not the poster.

Thank you for your understanding.


17 hours ago, Lisias said:

The article I linked is terribly dense (hard as hell to extract layman-understandable data from; it took me some time to get it), but in essence what it is saying is that once the rods are "overexposed", they take about 13 ms (interesting number) to cease working, and then another 7 ms to get back to work.

Please stop acting like eyes can be treated as digital cameras. Why do you believe single rod/cone response times prove your point? Do you think every cone and rod in your eye gathers and processes light at the same time, like a camera? If just 2 cones/rods are in opposite phase, then bam... possibly 150 Hz perception... But they aren't in opposite phase either; they're all in an amalgam, working entirely separately, nothing like a CMOS or CCD sensor, which is why making those kinds of comparisons makes no sense.

But don't take my word for it; why not look at and comment on the sources I provided, which you even asked for, which show and quantitatively prove the exact thing we're talking about, instead of pretending we're well versed in the anatomy of the eye?

I'll even post them again. They all directly test the gameplay impact of high-refresh-rate monitors in a fairly systematic way.

On 7/25/2021 at 4:19 PM, mcwaffles2003 said:

 

[three embedded videos]

 

Just cut to the end for the results if you don't want to watch all 3.

I'll even include another link that talks directly about the topic at hand:

Please, enjoy.

Edited by mcwaffles2003

On 7/25/2021 at 8:30 PM, Master39 said:

Nope, modern headsets run the full frame rate on both eyes.

Motion sickness is much more subtle than that: it can sprout from any discrepancy between what you feel and what you see, and a lower frame rate is known to cause it. I'm lucky; I don't suffer from any form of motion sickness no matter how hard I've tried. In over 1500 hours of VR, I only felt a little nauseous once, in Elite Dangerous, while rolling down a hill in a Scarab at low gravity while spinning.

But I can totally notice if I forget to set the frame rate back to 120; the difference in comfort and eye strain is real.

What you fail to understand is that the human eye is not a digital camera, and it makes no sense to treat it like one.

VR is another beast compared to normal games, as even first-person games are much more detached. With VR, the image has to match head movements, and you notice if this is off even just a bit.
I wonder if it would help to steal from electronic image stabilization: render a somewhat larger image and then show the part that would be visible where you look.
The benefit here is that the cropping would be done post-render, at the direction you are looking at that moment, eliminating the time to render the image. Other effects, like firing a gun, would still have the standard input lag, but that would be more like normal games.


On 7/24/2021 at 11:39 PM, mcwaffles2003 said:

When movies have a low frame rate, they show streaking, which helps us perceive intended fast motion, as in car chases.

I'd like to point out that motion blur is an effect of the long exposure time of each frame, not of the frame rate per se; you can have very crisp moving objects in a 24 fps film.


2 hours ago, magnemoe said:

VR is another beast compared to normal games, as even first-person games are much more detached. With VR, the image has to match head movements, and you notice if this is off even just a bit.

And that's exactly where you can see the frame rate and why it matters. You can't really notice the difference between 15 and 150 fps with a still camera pointing at a wall, but as soon as you move, even the difference between 90 and 120 can become apparent.

In video games, compared to movies, you have the camera moving, and moving fast to track fast-moving objects; in VR that assumes another dimension, since it is your own real head moving, and you have to keep the discrepancies to a minimum to avoid discomfort that can lead to nausea and thus motion sickness.

This is where the point of this whole argument lies.

2 hours ago, magnemoe said:

Render a somewhat larger image and then show the part that would be visible where you look.

That's basically how reprojection works. As an example, I can't always maintain 90 fps in VR games, so I sometimes force reprojection on, put the headset in 120 Hz mode, and lock FPS at 60; SteamVR then works its "magic" by somewhat distorting every frame to follow the head movement. It's not as comfortable as true 90 Hz, but it's still better than fluctuating frame rates or 60 Hz.
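(A crude sketch of that idea, purely illustrative and nothing like SteamVR's actual warp: render a margin around the visible area, then crop by the latest head-movement delta just before display.)

def reproject(frame, margin, dx, dy):
    # frame: 2D grid rendered with `margin` extra pixels on every side.
    # dx, dy: how far the view moved (in pixels) since the frame was rendered.
    h = len(frame) - 2 * margin
    w = len(frame[0]) - 2 * margin
    dx = max(-margin, min(margin, dx))   # can only shift as far as the margin allows
    dy = max(-margin, min(margin, dy))
    return [row[margin + dx : margin + dx + w]
            for row in frame[margin + dy : margin + dy + h]]

# 6x6 render with a 1-pixel margin -> 4x4 visible area, shifted 1 px to the right
big = [[f"{y},{x}" for x in range(6)] for y in range(6)]
for row in reproject(big, margin=1, dx=1, dy=0):
    print(row)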

 


1 hour ago, kurja said:

I'd like to point out that motion blur is an effect of the long exposure time of each frame, not of the frame rate per se; you can have very crisp moving objects in a 24 fps film.

Sure, they will appear crisp if each frame's exposure is shorter than its playback time, but then the motion will appear more choppy and stuttery. Conversely, if you play lower-frame-rate recordings on a higher-frequency display with motion interpolation, you end up with the soap opera effect:

https://whatis.techtarget.com/definition/soap-opera-effect-motion-interpolation
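(For reference, the simplest form of that interpolation is just a blend between neighbouring frames; a toy Python sketch follows. Real TVs do motion-compensated interpolation, which is far more involved.)

def interpolate(frames):
    # Insert a 50/50 blend between each pair of consecutive frames,
    # roughly doubling the frame rate.
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(pa + pb) / 2 for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

print(interpolate([[0, 0], [10, 20], [20, 40]]))
# [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]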


3 hours ago, kurja said:

I'd like to point out that motion blur is an effect of the long exposure time of each frame, not of the frame rate per se; you can have very crisp moving objects in a 24 fps film.

This is true. What is also true is that the director and cinematographer are careful to film such an image moving at a steady (and often slow) rate and without a complex background. Back at the dawn of 3D acceleration, 3dfx produced a demo showing a complex tower panning in one direction while a globe in the center spun in the opposite direction. This was then shown at 30 Hz and 60 Hz. 30 Hz was horribly choppy, as your eyes would follow one or the other reasonably smoothly but see the one they weren't "following" as choppy. 60 Hz was more or less fine (this was in the days of CRTs, and going much higher was pointless).

In practice, this means that flight simulators and some driving simulators are going to be fine at 30 fps. KSP sounds like a great candidate (and I wouldn't ignore it if you are on a Steam Deck or another system designed for 30 fps), but you should be prepared to notice a low frame rate when watching your craft descend on a planet/moon. Games like Fortnite take this to the extreme, as players might spin while jumping sideways to locate a similarly evasive enemy, then spin around them and attack (so both foreground and background are shaking quickly and independently).

Probably the example of this that sticks in my mind (and claw) is Oblivion (The Elder Scrolls IV). At the time it was released, it pushed GPUs hard. As a more or less sedate RPG, you could easily want to turn on as much eye candy as you could and settle down to a nice "cinematic" 30 fps. Unfortunately, Oblivion had some incredibly stupid choices in an otherwise brilliant game (the character leveling system was the worst), and the introductory sequence included a scene panning around the "castle/spire", where we watched at least one tower "parallax" in the midground much faster than the background, ruining the effect at 30 fps.

And I'm certain that if you can play it at 30 Hz, 60 Hz will look fine and you won't need 120 Hz. The stuff that "needs" 60 Hz could easily look better higher, assuming your eyes are up to it (I'm old enough to have put a quarter in a Pong machine [a commercially successful video game], and I'm shopping for bifocals; I don't expect to see a difference at 120 Hz).

https://www.youtube.com/watch?v=dNFB__rSxsQ (the offending bit is at 1:30, with a second, less objectionable tower flyby at 1:38, if you don't need a long Patrick Stewart monologue).

2 hours ago, Master39 said:

And that's exactly where you can see the frame rate and why it matters. You can't really notice the difference between 15 and 150 fps with a still camera pointing at a wall, but as soon as you move, even the difference between 90 and 120 can become apparent.

No. If that was all there was, then movies (24 fps) would have moved to 120 fps long, long ago, or at least people would watch them on TV at 60 Hz* (50 Hz in some countries, though I suspect that may have disappeared with digital) instead of going to the movies (they often do, but the reason has nothing to do with frame rate). The problem is when you have to track two or more objects at the same time. VR is another issue, as you really want the view to line up with the way your eyes are facing.

- * Note that since sports are broadcast live, presumably without a director controlling the action, they need >>24 fps to handle the scenes.

2 hours ago, Master39 said:

That's basically how reprojection works. As an example, I can't always maintain 90 fps in VR games, so I sometimes force reprojection on, put the headset in 120 Hz mode, and lock FPS at 60; SteamVR then works its "magic" by somewhat distorting every frame to follow the head movement. It's not as comfortable as true 90 Hz, but it's still better than fluctuating frame rates or 60 Hz.

Presumably it simply shifts/rotates the 2D images (effectively a "skybox" of 1/60th of a second ago) to where they "should" be in front of your eyes. I'm convinced that Epic (and Unity) need to do this in the 3D engine so they can rotate a "skybox in 3D" for much greater effect, ideally with sprites for non-background objects (like enemies...) that could run at true (or much faster) fps. Note that this method requires "filler" in some places, as it will have edges you weren't seeing around at first, and it isn't as simple as it first looks.

One of the big issues in wanting higher frame rates is the latency between when the software (typically the game) updates the image and when it appears on the screen. Typically, GPU designers "solve" this by throwing higher and higher frame rates at the screen. Likewise, since this creates a desire for higher frame rates, one thing GPU designers can do is lengthen the pipeline that creates the image, taking 2-3 frames to finally display it (I think AMD announced they were dropping from 3 to 2 frames a generation or two ago).

https://perma.cc/4VL7-LS32

This is a bit out of date (2012), but it cuts to the heart of the latency issue. Michael Abrash is pretty much a legend in graphics: he first wrote about how to optimize assembly, then had a stint at id when they were developing Quake, showed up at Valve to work on VR (Gabe was "always" trying to hire him), and was lastly poached by Facebook as chief scientist at Oculus.
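(A rough model of that pipeline effect, with illustrative numbers only: display latency grows with both the frame time and the number of queued frames.)

# Display latency ≈ pipeline depth (frames queued) x frame time.
for hz in (60, 120, 240):
    frame_ms = 1000.0 / hz
    for depth in (2, 3):
        print(f"{hz:>3} Hz, {depth}-frame pipeline: "
              f"~{depth * frame_ms:5.1f} ms before a change reaches the screen")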

2 hours ago, mcwaffles2003 said:

Sure, they will appear crisp if each frame's exposure is shorter than its playback time, but then the motion will appear more choppy and stuttery. Conversely, if you play lower-frame-rate recordings on a higher-frequency display with motion interpolation, you end up with the soap opera effect:

https://whatis.techtarget.com/definition/soap-opera-effect-motion-interpolation

If that is happening, turn "game mode" on. Motion interpolation is used on TVs and really doesn't belong on monitors. That said, as long as the TV can handle 60 Hz and has a "game mode", it can likely make a decent monitor (if this generation of game consoles can do 120 Hz, then maybe we will get TVs that can do 120 Hz, otherwise not; although I'd at least expect FreeSync soon). I'm using a 43" 4K TV (60 Hz) and it is amazing (and roughly the price of a 1080p 120 Hz monitor). Just don't expect to hit the Windows "maximize" button too often: 43" (or take a square meter and cut the top 1/3 off; that's close to the size) can really only be used full screen for movies and games; otherwise, use it like multiple monitors without worrying about bezels.


I think that past 60 fps you stop perceiving any differences... Probably at that point resolution is more important, but even that dwindles past a point, according to the actual physical size (and angular size) of the pixels themselves.

I can quite clearly tell the difference between 30 and 60 fps - though I think that's largely because most sites' comparisons also include resolution changes.

But I've tried going past 60... honestly, barely anything changes, IMO. Granted, I don't play FPSs, shooters, or RPGs a lot.

Maybe the one thing that changes is that there's ever so slightly less delay between input and confirmation, but again, that takes a while to process. Also, motion blur (the interpolation done by the screen) might be what makes it problematic there.

Edited by YNM

36 minutes ago, wumpus said:

No. If that was all there was, then movies (24 fps) would have moved to 120 fps long, long ago, or at least people would watch them on TV at 60 Hz* (50 Hz in some countries, though I suspect that may have disappeared with digital) instead of going to the movies (they often do, but the reason has nothing to do with frame rate). The problem is when you have to track two or more objects at the same time. VR is another issue, as you really want the view to line up with the way your eyes are facing.

The thesis here wasn't that 60 fps is enough for most uses, but that the human brain can't possibly see or perceive any difference whatsoever past 60 fps.

I'm not saying that it's worth going for more than 60; I still play at 60 Hz on a 1080p monitor, and I don't plan to change that anytime soon (I spent my fancy-monitor money on my Valve Index, and now the monitor is at the bottom of the list). I'm just saying that a change in frame rate is noticeable, and there is a difference between 30, 60, and 120 Hz, a difference a human eye can see.

3 hours ago, wumpus said:

While I know the story of how Valve (and especially Abrash) basically kickstarted the whole VR craze (Abrash and Gabe were endorsing Oculus in the very first Kickstarter campaign video), I had never got around to actually reading his blog posts. An interesting read, especially with today's hindsight.

I think the most relevant bit is where he talks about how many degrees an image lags behind due to the latency of the screen, this:

Quote

Suppose you rotate your head at 60 degrees/second. That sounds fast, but in fact it’s just a slow turn; you are capable of moving your head at hundreds of degrees/second. Also suppose that latency is 50 ms and resolution is 1K x 1K over a 100-degree FOV. Then as your head turns, the virtual images being displayed are based on 50 ms-old data, which means that their positions are off by three degrees, which is wider than your thumb held at arm’s length. Put another way, the object positions are wrong by 30 pixels. Either way, the error is very noticeable.

That doesn't only apply to VR; on the contrary, you snap your mouse-controlled field of view in an FPS way faster than you can move your head around. This alone is a good reason why monitors past 60 Hz are desirable for fast-paced shooters, and a clear case in which even an untrained eye can see a difference at different frame rates.
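(Checking the numbers in the quoted passage with a couple of lines of Python; the figures come straight from the quote.)

head_speed_deg_s = 60     # the quote's "slow" head turn
latency_s        = 0.050  # 50 ms motion-to-photon latency
fov_deg, width_px = 100, 1000

error_deg = head_speed_deg_s * latency_s        # degrees the image lags behind
error_px  = error_deg * (width_px / fov_deg)    # the same error expressed in pixels
print(f"error: {error_deg:.1f} degrees = {error_px:.0f} pixels")  # 3.0 degrees = 30 pixels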

3 hours ago, wumpus said:

and I wouldn't ignore it if you are on a Steam Deck or another system designed for 30 fps

Fun fact: I got a reservation for a December 2021 256 GB model, and I totally plan to play KSP 1 and 2 on it. I have no problem with 30 fps on a handheld.

 

7 minutes ago, YNM said:

Granted, I don't play FPSs, shooters, or RPGs a lot.

That may very well be the reason, and even if you did, it also depends on how fast you're used to moving in games.

