30 vs 60 vs 120 FPS

In my experience, framerate jitter (particularly sudden spikes and drops) disturbs the eye more than a fairly low but very constant framerate.

It’s debatable what exactly “fairly low” and “too much jitter” mean, but instead of aiming at something above 60Hz, I’d try to aim at a rock-solid 60Hz rate.
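
Something like this minimal, sleep-based frame limiter is what I have in mind; it’s just a hypothetical sketch (in practice you’d usually rely on vsync), but it shows the idea of pacing the loop to a steady 60Hz instead of chasing the highest number:

```java
// Minimal sleep-based frame limiter (hypothetical sketch, not from any engine).
// The goal is a steady ~60 Hz rather than an uncapped but jittery rate.
public final class FrameLimiter {
    private static final long FRAME_NANOS = 1_000_000_000L / 60;
    private long nextFrame = System.nanoTime();

    /** Call once per frame, after rendering, to pace the loop to ~60 Hz. */
    public void sync() throws InterruptedException {
        nextFrame += FRAME_NANOS;
        long remaining = nextFrame - System.nanoTime();
        if (remaining > 0) {
            Thread.sleep(remaining / 1_000_000L, (int) (remaining % 1_000_000L));
        } else {
            // We fell behind: reset the schedule instead of trying to catch up,
            // which would show up as exactly the jitter described above.
            nextFrame = System.nanoTime();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        FrameLimiter limiter = new FrameLimiter();
        for (int frame = 0; frame < 180; frame++) { // roughly 3 seconds at 60 Hz
            // rendering would happen here
            limiter.sync();
        }
    }
}
```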

If you shot a 24 FPS theatrical movie with a camera that exposes each frame for only 1/1000 of a second, the movie would appear choppy.

The reason movies in the cinema run smoothly is that each frame is exposed for a large part of its 1/24-second slot.
It picks up the change in movement during that time.

That’s the reason computer games appear choppy even at higher FPS:
their frames are still snapshots of the scene, not a scene in motion.

E.g. the faster your scene changes in a game, the higher the FPS should be.
A first-person game benefits more from a higher FPS than a top-down isometric game.

Old TV (CRT) screens also had some natural smoothing between frames, because the previous frame faded slowly and blended into the next. The image was less sharp this way, but motion appeared smoother and the screen flickering was less noticeable.

Computer screens always had less of this smoothing effect, I think intentionally, and needed higher refresh rates.

Ugh, it was like 4 in the morning… I had one of those “SOMEONE’S WRONG ON THE INTERNET”-moments.

True, and also why motion blur in games is so important since it increases the perceived frame rate. We still need at least 60Hz, but 60 FPS with a small amount of motion blur looks a lot smoother. I generally keep it off in shooting games since it’s easier to see details when the game doesn’t have motion blur, but I always leave it enabled in “movie” games like Crysis.

Ironically, motion blur is one of those things likely to cost a lot of frames per second in the first place…

Cas :slight_smile:

At 1080p:

  • 0.3 ms when the screen is 100% stationary (fast path in shader).
  • 1.7 ms when the screen is 100% in motion.

1.7 ms is enough to drop you from 60 FPS to about 54 FPS, which is still well worth it in my opinion. My motion blur is also far from the fastest implementation, but it’s still easy to tweak by clamping the maximum motion blur radius and the number of samples read for blurring. It’s a feature that runs almost 100% on the GPU (some negligible CPU resources are needed to calculate motion vectors), and its cost scales with screen resolution since it’s a postprocessing filter. If all else fails, you can just disable it.
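
To make the “clamp the radius and the sample count” part concrete, here is a rough, hypothetical software sketch of that kind of velocity-buffer motion blur; the real thing runs in a fragment shader, and the buffer names and parameters here are made up for illustration:

```java
// Rough software sketch of velocity-buffer motion blur (hypothetical; a real
// implementation runs in a fragment shader). For each pixel we step along its
// motion vector, average a fixed number of samples, and clamp the blur radius
// so fast-moving pixels don't smear across the whole screen. Alpha is ignored.
public final class MotionBlurSketch {

    static int[] motionBlur(int[] color, float[] velX, float[] velY,
                            int w, int h, int samples, float maxRadius) {
        int[] out = new int[color.length];              // assumes samples >= 2
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int idx = y * w + x;
                float vx = velX[idx], vy = velY[idx];
                float len = (float) Math.sqrt(vx * vx + vy * vy);
                if (len > maxRadius) {                  // clamp the blur radius
                    vx *= maxRadius / len;
                    vy *= maxRadius / len;
                }
                int r = 0, g = 0, b = 0;
                for (int s = 0; s < samples; s++) {
                    float t = s / (float) (samples - 1) - 0.5f; // -0.5 .. +0.5
                    int sx = clamp(Math.round(x + vx * t), 0, w - 1);
                    int sy = clamp(Math.round(y + vy * t), 0, h - 1);
                    int c = color[sy * w + sx];
                    r += (c >> 16) & 0xFF;
                    g += (c >> 8) & 0xFF;
                    b += c & 0xFF;
                }
                out[idx] = ((r / samples) << 16) | ((g / samples) << 8) | (b / samples);
            }
        }
        return out;
    }

    static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    public static void main(String[] args) {
        // Tiny 4x1 test image moving 2 pixels to the right, blurred with 4 samples.
        int[] color = { 0xFF0000, 0x00FF00, 0x0000FF, 0xFFFFFF };
        float[] vx = { 2, 2, 2, 2 }, vy = { 0, 0, 0, 0 };
        for (int px : motionBlur(color, vx, vy, 4, 1, 4, 8f)) {
            System.out.printf("%06X ", px);
        }
        System.out.println();
    }
}
```

Fewer samples and a smaller maximum radius make it cheaper but blockier; that’s the whole tuning knob.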

It doesn’t matter.

  • Jev

Heh. Maybe on your fancypants GPU, not on my crappy ones :slight_smile: Just turning on distortions on my Intel GPU cuts the framerate in half…

Cas :slight_smile:

On a Galaxy GeForce GT 610. Not that fancy.

As long as your game can run at 60 FPS with no issues on an average computer, there’s nothing to worry about. 120 FPS is just silly.

  • Jev

I have found that the people who actually care about a perfect 120+ FPS are gamers. And I mean gamers, not your average COD players. In other words, the lowest common denominator doesn’t care what the frame rate is as long as the game plays well.

Most people aren’t rocking monitors above 120 Hz, so why bother? It would be nice to have the game support 60+, but the people with the hardware to get that are even fewer. It’s like those few PC gamers (as we know, consoles aren’t hitting 120 anytime soon) with 120 Hz monitors and pimped hardware who expect all this fancy stuff, forgetting that they’re a small part of the market.

I think it will be a while before I look into 60+ FPS, as I don’t have a monitor above 60 Hz.

I always think it worthwhile to remember a silly factoid:

[quote]You don’t see the world with your eyes, you see it with your brain.
[/quote]
What this means is that one single feature of the images being shown (in this case framerate) is not enough to grasp how it affects our perception.
Our brain does a lot of post-processing before our consciousness “sees” the image, and much of that post-processing is affected by the contents of the image themselves, which is why we get optical illusions.

This, on one hand, makes studies on the effects of FPS somewhat suspect, especially if they just fixate on the eye, but it also casts doubt on personal experiences of the “I can feel the difference” kind; most of the time we see what we want to see.

What I’m trying to say is that FPS alone is meaningless without context. The details of the image you are presenting probably have more influence on how smooth it is perceived to be than the number of frames itself.

Here are some funny examples of how our brain fails miserably at seeing the world.

Are you sure about that? I always thought that CRTs had much faster response times and needed at least 100Hz to avoid flicker.

I think that if one day we have VR with working eye-tracking, high frame rates will be a lot less computationally demanding, even at proper resolutions for VR (like 4K or even 8K).

Your eyes only see sharply within about 2 degrees. Outside of that it all quickly becomes a blur. Your brain does the rest to make you think it’s all sharp.

At a 100-degree FoV and 8K resolution (i.e. 7680*4320), that’s something like 88*88 pixels per eye that need the best-quality rendering.
Outside of those 88*88 pixels, you can increasingly (and quickly) drop resolution, AA, texture resolution, shader quality, etc.
This might be a bit of an oversimplification, but I suppose even today’s PCs should be able to handle that at at least 60fps.
But good eye-tracking will be key here.
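
To put rough numbers on that (using the assumptions above: a 100-degree FoV, an 8K panel split across two eyes, about 2 degrees of sharp vision, and a crude linear pixels-per-degree approximation; the exact count depends on how you slice it):

```java
// Back-of-the-envelope version of the argument above. All numbers are the
// assumptions from the post, plus a crude linear pixels-per-degree model.
public final class FoveaMath {
    public static void main(String[] args) {
        double fovDegrees = 100.0;      // assumed horizontal field of view
        double sharpDegrees = 2.0;      // angle the eye actually resolves sharply
        int panelWidth = 7680;          // 8K panel: 7680 x 4320
        int panelHeight = 4320;
        int eyeWidth = panelWidth / 2;  // each eye gets half the panel

        double pxPerDegreeX = eyeWidth / fovDegrees;
        double pxPerDegreeY = panelHeight / fovDegrees;
        long fovealPixels = Math.round(pxPerDegreeX * sharpDegrees)
                          * Math.round(pxPerDegreeY * sharpDegrees);
        long totalPixels = (long) eyeWidth * panelHeight;

        System.out.printf("Foveal region: ~%d of %d pixels per eye (%.3f%%)%n",
                fovealPixels, totalPixels, 100.0 * fovealPixels / totalPixels);
        // Prints roughly 77 x 86 (about 6,600 pixels) out of ~16.6 million,
        // i.e. well under 0.1% of the pixels need full-quality rendering.
    }
}
```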

[edit]
Furthermore, when you quickly look elsewhere by turning your eyes/head, your brain needs a bit of time to adjust, see things sharply, and process what you see. Test it yourself: open a book to a random page and see how much time it takes between reading a word in the middle of the left page and reading a word in the middle of the right page.
How long did it take? Half a second? That’s like a week in computer time :slight_smile:
So there are possibilities for optimization there as well.

I doubt that will ever happen.
Besides the obvious problem of what to do when you have multiple observers, you’d also need absolutely insane response times for the eye not to register the low-resolution/quality peripheral areas when moving the focal point.

Bypassing the eye entirely & patching into the optic nerve is the next logical step IMO, though I can’t even begin to imagine how that would feel to the user.

60 FPS, IMO, should be the standard to aim for; 30 has many drawbacks.

A few to note would be:

  • With FPS drops of 1-5 you will notice heavy stutter
  • FPS below 25 can cause eye strain (this goes with point 1)
  • IMO, at large resolutions + 30 FPS, things such as smoke and particles look sticky

With 60 FPS you are safe; you have about 20 FPS worth of wiggle room.

Can’t say much for 120 Hz monitors. I have a 120 Hz TV and it is just really weird to watch 1080p movies, or games, on it. It is ridiculously smooth; I think standard 60 Hz monitors (the cheap ones) play HD back at 25 FPS.

When a game is about reflexes, a 120 Hz monitor and over 120 FPS are nice to have. I play iRacing, where the difference between 60 and 120 is noticeable, but not significant.

The game should ideally be tuned for perfect reflexes at 60Hz. Also, in multiplayer… it’s a decided advantage to have a 120Hz update rate vs. 60Hz, and thus all players should be capped to 60Hz anyway. And so on.
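
The usual way to get that kind of cap is a fixed-timestep loop, where the logic always ticks at 60Hz no matter how fast the machine renders. A minimal, hypothetical sketch (not from any particular game):

```java
// Minimal fixed-timestep sketch (hypothetical): the simulation always ticks at
// 60 Hz regardless of how fast the machine renders, so a player rendering at
// 120 FPS gets smoother pictures but no extra logic updates.
public final class FixedStepLoop {
    private static final double STEP = 1.0 / 60.0;   // seconds per logic tick

    public static void main(String[] args) throws InterruptedException {
        double accumulator = 0.0;
        long previous = System.nanoTime();
        int ticks = 0;

        while (ticks < 300) {                         // demo: run ~5 simulated seconds
            long now = System.nanoTime();
            accumulator += (now - previous) / 1_000_000_000.0;
            previous = now;

            while (accumulator >= STEP) {             // catch up in fixed 60 Hz steps
                update(STEP);
                accumulator -= STEP;
                ticks++;
            }
            render();                                 // draw as often as you like
            Thread.sleep(1);                          // keep the demo from spinning
        }
    }

    private static void update(double dt) { /* advance the game state by dt seconds */ }
    private static void render()          { /* draw the latest state */ }
}
```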

… but one day we’ll all be running 120Hz.

Cas :slight_smile:

D: The horror!

With VR you only have one observer: The wearer of the VR headset, so no problem there.
Eye tracking is then also very close to the eyes, so it should be possible to make it quite accurate too using just small cameras.
And I think the response time of eye-tracking might not be as critical as you think. It should be at most one screen refresh, which is perfectly doable today. And your brain needs a bit of time to adjust to the new focus point anyway.
And what if it takes a bit of time for things to adjust to your new focus point when you move your eyes? It’ll still be better than not adjusting at all.

It seems a lot more practical than ‘patching into the optic nerve’, imho (which seems to be disturbingly close to ‘eXistenZ’ territory :o).
As a matter of fact I think it could be doable within a few years.

IMHO Moore’s law isn’t moving fast enough to get us to high resolutions running above 60 FPS, so optimizing for what we actually see and perceive with our brains seems to have a lot of untapped potential to me.