30 vs 60 vs 120 FPS

Important numbers: (Some of these are from memory and may be slightly off)

~24 fps: the human eye perceives this as a moving picture rather than a series of frames.
~48 fps: anything lower will strain your eyes over long periods of time.
60 fps: the refresh rate of most monitors. Anything above 60 fps on a 60 Hz monitor is a waste of processing power, as you're writing to the framebuffer multiple times between reads.

I have a monitor overclocked to 90 Hz and I can easily see a single frame being dropped, despite the above-average motion blur my screen has. I had a 144 Hz screen for a while, on which it was a little harder to see a frame drop, but they sure as hell were there. You're not speaking for everyone, so please kindly shut up. I'm getting tired of this argument already. It's on the same level as claiming that everyone needs glasses.

Game logic cycles shouldn't be tied to frame rate. That was fine in the past, but with so many different machines and devices, and operating systems that can hand the CPU to other applications at any time, it just doesn't work very well anymore.
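The usual way to decouple the two is a fixed-timestep loop: logic advances in constant steps while rendering happens as often as time allows. A minimal sketch (all names here are illustrative, not from any particular engine):

```java
// Minimal fixed-timestep loop: game logic advances in constant steps no
// matter how fast (or slow) frames render.
public class FixedStepLoop {
    static final double UPS = 60.0;                    // logic updates per second
    static final double STEP_NS = 1_000_000_000.0 / UPS;

    public static void main(String[] args) {
        double accumulator = 0;
        long previous = System.nanoTime();
        int updates = 0;
        while (updates < 30) {                         // short demo; a game loops forever
            long now = System.nanoTime();
            accumulator += now - previous;             // real time since last pass
            previous = now;
            // catch up on logic in fixed steps, however long rendering took
            while (accumulator >= STEP_NS && updates < 30) {
                updates++;                             // update() would go here
                accumulator -= STEP_NS;
            }
            // render() would go here, as often as time allows
        }
        System.out.println("ran " + updates + " fixed updates");
    }
}
```

Because the accumulator carries real elapsed time, the logic rate stays constant even when the OS steals the CPU for a while: the loop simply runs several updates back to back to catch up.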

You will hear a lot of hoopla about the "maximum frame rate an eye can see" being 30-40 FPS (and that number is always changing), but those people don't have a clue what they're talking about.

Yes, most recent research suggests our eyes process information at roughly 30-40 frames per second, but our eyes are NOT in perfect sync with the image we are viewing. If we could sync our eyes absolutely perfectly with our monitors, and the frame rate stayed at a perfectly solid, unwavering rate that matched our eyes (because ours isn't constant either), then 30-40 frames per second would be the smoothest video in the entire universe; we couldn't tell the difference between it and real life.

But the reality is that since our eyes process the data at a different rate, in different ways, and out of sync with the video, we need even more frames to keep everything smooth, because we (for lack of a better word) need "filler frames". From the latest and greatest in TV/monitor tech:

Under 30 FPS - Choppy, but if consistent, still "good". Believe it or not, theatrical movies run even lower: a standard Blu-ray disc runs at only 24 frames per second.
60~ - Better for gaming or media whose frame rate isn't consistent, giving our eyes the additional data required to process the information into a smooth image.
120~ - Getting to the point where our eyes won't be able to tell the difference; there are so many additional frames that our eyes have plenty of data to translate to our brains.
240~ - They claim this is the absolute top; anything beyond this would just be silly. But the difference between 120 and 240 is almost unnoticeable, even to trained professionals. Anyone claiming they can tell the difference between 240 and 480 is just experiencing a placebo effect. (Quite frankly, there's a good chance that's true between 120 and 240 too.)

So I say the "holy grail" frame rate for gaming is 120 FPS. But like BurntPizza said, you need a 120 Hz monitor to get the full benefit. Although 120 FPS on a 60 Hz monitor is still better than 60 FPS, because any dips/skips won't even be visible on a 60 Hz monitor unless they go below 60 FPS.

Regardless, I would design your games to run at 60 FPS; that's pretty much the standard. Running at 120 FPS takes a ton of extra processing power for a very minor difference.

http://boallen.com/fps-compare.html

As stated, these arguments are pointless. The demo above shows the difference between each FPS. On a 60 Hz monitor, you can't see any noticeable difference between 120 FPS and 60 FPS. For gaming, 60 FPS is the standard, but do not go below 30 FPS. Hopefully this post supplements the posts above…

On my 60Hz monitor, I can always tell a difference between 60 and 30 fps. 30 is playable but it is choppy.

They may have, depending on what rate the game updates at and how it's implemented. A commonly used game loop updates at even intervals and then renders when there's time left over. Let's say you have an unusually high update rate of 100 Hz (hereafter referred to as UPS, updates per second). Let's calculate the average input delay of a high-end gamer with proper gaming gear and a computer that can achieve 100 FPS, versus a low-end "casual" gamer who gets 40 FPS.

The most important difference when it comes to FPS and input delay here is what happens when the FPS drops below the UPS. At 100 UPS the game should update every 10 milliseconds. However, at 40 FPS the rendering part of the game loop takes 25 milliseconds by itself, meaning that although the game still updates at 100 UPS, the updates aren't evenly spread out.

100 FPS and 100 UPS
[update][update][update][update][update]

40 FPS and 100 UPS
[update][update]<------render------->[update][update][update]<------render------->
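The bursty pattern above can be reproduced with a few lines of arithmetic. A sketch (the numbers match the example: 10 ms fixed steps at 100 UPS, 25 ms renders at 40 FPS):

```java
// Simulates the loop above: at 40 FPS each render takes 25 ms, so a 100 UPS
// game ends up running its 10 ms updates in alternating bursts of 2 and 3.
public class UpdateClustering {
    // Returns how many fixed updates run immediately before each of `frames` renders.
    static int[] burstsPerFrame(double stepMs, double renderMs, int frames) {
        int[] bursts = new int[frames];
        double accumulator = renderMs;       // time owed after the previous render
        for (int f = 0; f < frames; f++) {
            while (accumulator >= stepMs) {  // catch up in fixed steps
                accumulator -= stepMs;
                bursts[f]++;
            }
            accumulator += renderMs;         // then rendering the frame takes 25 ms
        }
        return bursts;
    }

    public static void main(String[] args) {
        int[] bursts = burstsPerFrame(10.0, 25.0, 4);  // 100 UPS, 40 FPS
        for (int b : bursts) System.out.print("[" + b + " updates][render]");
        System.out.println();  // prints alternating bursts of 2 and 3 updates
    }
}
```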

In general, a game update is often a magnitude or more faster than rendering a frame, so in practice the 40 FPS computer is reading input 2-3 times within a very short period, which effectively means only the first read has a reasonable chance of catching any new input. The result is that although the game speed is constant regardless of FPS, input reading effectively happens in sync with your frame rate, not your update rate: 100 FPS = 0-10 ms delay until the input is detected, 40 FPS = 0-25 ms.
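Those averages follow from input arriving at a random moment within a frame, so the mean wait until the next read is half the frame time. A quick back-of-the-envelope sketch:

```java
// Back-of-the-envelope input delay: input arrives at a random moment within
// a frame, so on average you wait half a frame before the next read sees it.
public class InputDelay {
    static double avgReadDelayMs(double fps) {
        return (1000.0 / fps) / 2.0;   // half the frame time, in milliseconds
    }

    public static void main(String[] args) {
        System.out.println("100 FPS: 0-" + (1000.0 / 100) + " ms, average "
                + avgReadDelayMs(100) + " ms");
        System.out.println(" 40 FPS: 0-" + (1000.0 / 40) + " ms, average "
                + avgReadDelayMs(40) + " ms");
    }
}
```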

Gamer vs. Casual:
  • Mouse/keyboard USB polling delay: 1000 Hz = avg 0.5 ms vs. 100 Hz = avg 5 ms
  • Average delay until the game loop notices and reads input: 100 FPS = avg 5 ms vs. 40 FPS = avg 12.5 ms
  • Time needed for the GPU to render a frame: 100 FPS = 10 ms vs. 40 FPS = 25 ms
  • Monitor update time: 2 ms (gaming monitor) vs. 8 ms (slow monitor)
  • Total: 17.5 ms vs. 50.5 ms
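The totals are just the column sums; reproducing the arithmetic:

```java
// Reproduces the totals in the latency table above by summing each column.
public class LatencyBudget {
    static double sum(double... xs) {
        double total = 0;
        for (double x : xs) total += x;
        return total;
    }

    public static void main(String[] args) {
        // USB polling, game-loop read, GPU render, monitor update (all in ms)
        double gamer  = sum(0.5, 5.0, 10.0, 2.0);
        double casual = sum(5.0, 12.5, 25.0, 8.0);
        System.out.println("gamer:  " + gamer + " ms");   // 17.5 ms
        System.out.println("casual: " + casual + " ms");  // 50.5 ms
    }
}
```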

In practice, the time needed for the GPU to render a frame is most likely significantly higher, due to the GPU lagging behind the CPU to ensure it always has work to do. This is also amplified by a lower FPS, since the driver starts limiting the CPU either when it's X frames behind or when it has Y commands queued up. Being 3 frames behind at 100 FPS is only 30 ms of extra delay, but at 40 FPS we're talking about 75 ms.

Carmack is doing a lot of 120 Hz work because of the Oculus.

He said that even just using Windows at 120 Hz with a 120 Hz monitor, the cursor is much smoother, let alone in games.
Anyone who says that 60 is even close to the limit we can see has never really researched this topic and is just repeating hearsay, aka being an idiot.

[quote]240~ - They claim this is the absolute top, anything beyond this would just be silly. But the difference between 120 -> 240 are almost unnoticeable, even to the trained professionals. Anyone claiming they can tell the difference between 240 and 480 are just getting a placebo effect. (Quite frankly there’s a good chance thats true between 120 and 240 too)
[/quote]
We'll see. It's very easy to test.
Show me something at 30 fps and something at 60 fps for 2 seconds, one time, and I will always be able to tell them apart. It's night and day. From that I know that 60 and 120 will still be quite noticeable, and I wouldn't be surprised if there's more room above that. But we'll see.

[quote]We'll see. It's very easy to test.
Show me something at 30 fps and something at 60 fps for 2 seconds, one time, and I will always be able to tell them apart. It's night and day. From that I know that 60 and 120 will still be quite noticeable, and I wouldn't be surprised if there's more room above that. But we'll see.
[/quote]
I have a $3000 TV that does 240 Hz. Although it's that MotionFlow 240 Hz stuff, because there's no way to even transmit real 240 FPS to a device over HDMI at 1080p; I don't think even dual-link DVI can do it. I feel like I can tell the difference between 120 Hz and 240 Hz, but it's negligible. Honestly, I feel it's just the placebo effect.

MotionFlow 240 Hz only exists because it allows 120 Hz 3D playback, and you can see the difference there… even though 3D sent over HDMI at 1080p is only 24 Hz/24 FPS. DVI, on the other hand, can manage a nice 60 FPS 3D (though it takes a 120 Hz TV to do it). But really, using my MotionFlow for 240 Hz is kind of silly. It's just a byproduct of having MotionFlow 3D at 120 Hz.

I don't know about other games, but take League of Legends, for example. I play at only 40-70 fps and it really seems choppy. My bro's PC runs at 120-150 fps and it seems so much smoother. It's not all about fps in LoL: the more fps, the more responsive the controls. You can really feel the difference between 60 and 120, no matter what eye scientists tell you.

The thing about 60 vs 120 is… it’s a law of diminishing returns. 60 is as good as it needs to get for nearly everyone. Although a lot of people will see improvements at 120Hz, they’ll already be happy at 60Hz. The main problem with rendering at 120Hz is that you literally need twice the processing power and fancy hardware, so you’re already limiting your audience drastically.

Cas :slight_smile:

That escalated quickly. I’ve never seen a person so pissed off about a framerate.

Well, you’ve got to give it to him that he said please. Always a gentleman.

In my experience framerate jitter (particularly sudden spikes and drops) disturbs the eye more than a fairly low but very constant framerate.

It's debatable what exactly "fairly low" and "too much jitter" mean, but instead of aiming at something above 60 Hz, I'd try to aim for a rock-solid 60 Hz rate.

If you shot a 24 FPS theatrical movie using a camera with an exposure of only 1/1000 of a second per frame, the movie would appear choppy.

The reason movies in the cinema run smoothly is that each frame has an exposure of almost 1/24 of a second.
It picks up the change in movement during this time.

That's the reason computer games appear choppy even at higher FPS:
their frames represent still snapshots of the scene, not a scene in motion.

E.g. the faster your scene changes in a game, the higher the FPS should be.
A first-person game profits more from a higher FPS than a top-down isometric game.

Old CRT TV screens also had some natural smoothing between frames, because the previous frame faded slowly and blended with the next. The image was less sharp this way, but motion appeared smoother and the screen flicker was less noticeable.

Computer screens always had less of this smoothing effect, I think intentionally, and so needed higher refresh rates.

Ugh, it was like 4 in the morning… I had one of those “SOMEONE’S WRONG ON THE INTERNET”-moments.

True, and it's also why motion blur in games is so important, since it increases the perceived frame rate. We still need at least 60 Hz, but 60 FPS with a small amount of motion blur looks a lot smoother. I generally keep it off in shooting games, since it's easier to see details without motion blur, but I always leave it enabled in "movie" games like Crysis.

Ironically motion blur is one of those things likely to cost a lot of frames-per-second in the first place…

Cas :slight_smile:

At 1080p:

0.3 ms when the screen is 100% stationary (fast path in the shader).
1.7 ms when the screen is 100% in motion.

1.7 ms is enough to drop you from 60 FPS to 54 FPS, which is still well worth it in my opinion. My motion blur is also far from the fastest implementation, but it's easy to tune by clamping the maximum blur radius and the number of samples read for blurring. It's a feature that runs almost 100% on the GPU (some negligible CPU time is needed to calculate motion vectors), and it scales with screen resolution since it's a post-processing filter. If all else fails, you can just disable it.
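The core of a post-process motion blur is just averaging samples along the per-pixel motion vector, with the radius and sample count clamped as described above. A CPU-side sketch for illustration only (real implementations do this per pixel in a fragment shader; the grayscale 1D row and all names here are made up):

```java
// Illustrative motion blur kernel: average `samples` taps along the motion
// vector, clamping the blur length to maxRadius. 1D grayscale for brevity.
public class MotionBlurSketch {
    // row: one scanline of brightness values; x: pixel index;
    // motion: per-pixel motion in pixels; samples: >= 2 taps.
    static double blurSample(double[] row, int x, double motion,
                             int samples, double maxRadius) {
        double m = Math.max(-maxRadius, Math.min(maxRadius, motion));
        double sum = 0;
        for (int i = 0; i < samples; i++) {
            double t = (i / (double) (samples - 1)) - 0.5;  // -0.5 .. +0.5
            int xi = (int) Math.round(x + t * m);           // tap position
            xi = Math.max(0, Math.min(row.length - 1, xi)); // clamp to edge
            sum += row[xi];
        }
        return sum / samples;
    }

    public static void main(String[] args) {
        double[] row = {0, 0, 0, 1, 0, 0, 0};          // a bright pixel at x = 3
        System.out.println(blurSample(row, 3, 0, 5, 8.0)); // no motion: stays sharp
        System.out.println(blurSample(row, 3, 4, 5, 8.0)); // motion: smears out
    }
}
```

Shrinking `samples` or `maxRadius` trades quality for speed, which is why the cost above is so easy to tune.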

It doesn’t matter.

  • Jev