30 vs 60 vs 120 FPS

So I did a quick search, didn’t find anything on this, and was wondering: do you think there are huge noticeable differences between 30, 60 and 120 fps? If so, by how much? Is 120 that much better than 60? Does the frame rate (not linked to logic) really affect the game, and by how much?

This is because some dude was ripping on me when I said that from 60 to 120 there isn’t such a huge noticeable difference. Most wouldn’t even care to notice.

Sorry if this was talked about already but I couldn’t find anything.

Your eyes can’t tell the difference between 60 and 120; a regular person begins to notice choppiness at around 40 fps.

The higher the FPS, the less noticeable the change per frame should be (i.e. more smooth). With lower FPS, you have to have more movement change per frame to achieve the same difference in the same amount of time when rendering graphics. 120 FPS should be the least noticeable, but I don’t think human eyes can process the difference much above 30-60.

30-60 fps is standard in GBA games, IIRC.

I can notice the difference between 60 and 120 but only during competitive gameplay and only if the monitor is running at 120Hz.

https://boallen.com/fps-compare.html
http://frames-per-second.appspot.com/
Between 15, 30, and 60 there is a massive difference. Those who say otherwise don’t know what they’re talking about.

As for 120fps, I think many can still see improvement (I certainly can), but that only happens if they have a 120Hz monitor (rare), and may not be worth the effort (to optimize, etc.)

Clip of Battlefield at 60 and 120, both slowed down so you can view them on a 60Hz monitor, although that doesn’t give quite the same level of comparison: https://www.youtube.com/watch?v=qwkFj-hYtkw

Also monitor response time is critical, I can still detect ghosting at 60 on mine :\

Even if your eyes can’t process 120 fps, a higher frame rate would start to blur together, which would appear more natural to your brain. Real life runs at a pretty high “fps”; each frame you perceive is an amalgamation of many.

15-30 Huge difference.

30-60 More fluid, but not as big a jump as 15-30.

60-120 I see a slight difference. That is, I can tell, but I need to really pay attention.

The human eye can pick up light from miles away and detect flashes lasting tiny fractions of a second, so of course we can see at more than 30/60/9999 fps. The thing is diminishing returns: how much faster do things need to run to get the payoff of more fluid movement? I think 60 should be the standard, because 60 gives the best results without making things unreasonably slow.

What about gameplay? Does it really affect it? If the simulation runs at, say, 60 logic cycles per second and someone plays at 120 fps, do they have an advantage? I know that input can be tied to framerate, but let’s say the rate at which frames are shown doesn’t affect anything else.

Important numbers: (Some of these are from memory and may be slightly off)

~24 fps. The human eye sees it as a moving picture rather than a series of frames.
~48 fps. Anything lower will cause strain to your eyes over a long period of time.
60 fps. Refresh rate of most monitors. Any more than 60fps on a 60Hz monitor is a waste of processing power, as you’re writing to the framebuffer multiple times between reads (see the sketch below).
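
To make that last point concrete, here’s a minimal sketch of capping the render loop to an assumed 60Hz target; the renderFrame() placeholder, the class name, and the sleep-based wait are just illustrative, not taken from anyone’s actual engine.

```java
// Minimal frame-cap sketch: avoid rendering faster than the (assumed) 60Hz refresh.
public class FrameCap {

    private static final long TARGET_FRAME_NANOS = 1_000_000_000L / 60; // ~16.67ms budget per frame

    public static void main(String[] args) throws InterruptedException {
        while (true) {
            long frameStart = System.nanoTime();

            // renderFrame();  // hypothetical: draw the frame here

            // Sleep off whatever is left of the 1/60s budget so we don't write to
            // the framebuffer more often than the monitor can read it.
            long remaining = TARGET_FRAME_NANOS - (System.nanoTime() - frameStart);
            if (remaining > 0) {
                Thread.sleep(remaining / 1_000_000L, (int) (remaining % 1_000_000L));
            }
        }
    }
}
```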

I have a monitor overclocked to 90Hz and I can easily see a single frame being dropped, despite the more-than-usual motion blur my screen has. I had a 144Hz screen for a while, on which it was a little harder to see a frame drop, but they sure as hell were there. You’re not speaking for everyone, so please kindly shut up. I’m getting tired of this argument already. It’s on the same level as claiming that everyone needs glasses.

Game logic cycles shouldn’t be tied to frame rate. It was fine to do that in the past, but with so many different machines and devices, and operating systems that can hand the CPU out to other applications at any time they want, it just doesn’t work very well anymore.
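
For illustration, here’s a minimal sketch of that decoupling using a fixed timestep and an accumulator; the 60 updates-per-second figure and the update()/render() placeholders are assumptions for the example, not anyone’s actual loop.

```java
// Fixed-timestep loop sketch: logic always runs at UPDATES_PER_SECOND, while
// rendering runs as fast (or as slow) as the machine allows.
public class GameLoop {

    private static final double UPDATES_PER_SECOND = 60.0;  // illustrative logic rate
    private static final double NANOS_PER_UPDATE = 1_000_000_000.0 / UPDATES_PER_SECOND;

    public void run() {
        long previous = System.nanoTime();
        double accumulator = 0;

        while (true) {
            long now = System.nanoTime();
            accumulator += now - previous;
            previous = now;

            // Run as many fixed-size logic steps as the elapsed time calls for,
            // so game speed stays the same whether we render at 30, 60 or 120 FPS.
            while (accumulator >= NANOS_PER_UPDATE) {
                update();
                accumulator -= NANOS_PER_UPDATE;
            }

            render();  // frame rate is simply whatever the hardware manages
        }
    }

    private void update() { /* advance the game state by exactly 1/60s */ }

    private void render() { /* draw the current state */ }
}
```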

You will hear a lot of hoopla about the “maximum frame rate an eye can see” being 30-40FPS (and that number is always changing), but those people don’t have a clue what they’re talking about.

Yes, most recent research shows our eyes seem to process information at roughly 30-40 frames per second, but our eyes are NOT in perfect sync with the image we are viewing. If we could sync our eyes absolutely perfectly with our monitors, and the frame rate stayed at a perfectly solid, unwavering rate that matched our eyes (because theirs isn’t constant either), then 30-40 frames per second would be the smoothest video in the entire universe; we couldn’t tell the difference between it and real life.

But the reality is, since our eyes process the data at a different rate, in different ways, and out of sync with the video, we need even more frames to keep everything smooth, because we (for lack of a better word) need “filler frames”. From the latest and greatest in TV/monitor tech:

Under 30FPS - Choppy, but if consistent, still “good”. Believe it or not, a lot of movies actually run at around 24FPS; a standard Blu-ray disc only runs at 24 frames per second.
60~ - Better for gaming or media whose frame rate isn’t consistent, giving our eyes the additional data required to process the information into a smooth image.
120~ - Getting to the point where our eyes won’t be able to tell the difference; there are so many additional frames that our eyes have plenty of data to translate to our brains.
240~ - They claim this is the absolute top; anything beyond this would just be silly. But the difference between 120 and 240 is almost unnoticeable, even to trained professionals. Anyone claiming they can tell the difference between 240 and 480 is just getting a placebo effect. (Quite frankly, there’s a good chance that’s true between 120 and 240 too.)

So I say the “holy grail” frame rate for gaming is 120FPS. But like BurntPizza said, you need a 120Hz monitor to get the full benefit. Although 120FPS on a 60Hz monitor is still better than 60FPS, because if you have dips/skips they won’t even be visible on a 60Hz monitor unless those dips go below 60FPS.

Regardless, though, I would design your games to run at 60FPS; that’s pretty much the standard. Running at 120FPS takes a ton of extra processing power for a very minor difference.

http://boallen.com/fps-compare.html

As stated, these arguments are pointless. The demo above shows the difference between each FPS. On a 60Hz monitor, you can’t see any noticeable difference between 120 FPS and 60 FPS. For gaming, 60FPS is the standard, but do not go below 30 FPS. Hopefully, this post supplements the posts above…

On my 60Hz monitor, I can always tell a difference between 60 and 30 fps. 30 is playable but it is choppy.

They may have, depending on what rate the game is updating at and how it’s implemented. A commonly used game loop is the one where you update at even intervals and then render when you have time left over. Let’s say you have an unusually high update rate of 100 Hz (hereafter referred to as UPS, updates per second). Let’s calculate the average input delay of a high-end gamer with proper gaming gear and a computer that can achieve 100 FPS, and a low-end “casual” gamer who gets 40 FPS.

The most important difference when it comes to FPS and input delay here is what happens when the FPS drops below the UPS. At 100 UPS the game should update every 10 milliseconds. However, at 40 FPS the rendering part of the game loop takes 25 milliseconds by itself, meaning that although the game still updates at 100 UPS, the updates aren’t evenly spread out.

100 FPS and 100 UPS
[update][update][update][update][update]

40 FPS and 100 UPS
[update][update]<------render------->[update][update][update]<------render------->

In general, updating the game is often an order of magnitude or more faster than rendering a frame, so in practice the 40 FPS computer is reading input 2-3 times within a very short period, which effectively means that only the first read has a reasonable chance of catching any new input. The result is that although the game speed and so on are constant regardless of FPS, the input reading effectively happens in sync with your framerate, not your update rate. 100 FPS = 0-10ms delay until the input is detected, 40 FPS = 0-25ms.

| | Gamer | Casual |
| --- | --- | --- |
| Mouse/keyboard USB polling delay | 1000Hz = average 0.5ms | 100Hz = average 5ms |
| Average delay until the game loop notices and reads input | 100FPS = average 5ms | 40FPS = average 12.5ms |
| Time needed for the GPU to render a frame | 100FPS = 10ms | 40FPS = 25ms |
| Monitor update time | 2ms (gaming monitor) | 8ms (slow monitor) |
| Total | 17.5ms | 50.5ms |

In practice, the time needed for the GPU to render a frame is most likely significantly higher, due to the GPU lagging behind the CPU to ensure that it always has stuff to do. This is also amplified by a lower FPS, since the GPU starts limiting the CPU either when it’s X frames behind or when it has Y commands queued up. Being 3 frames behind at 100 FPS is only 30ms of extra delay, but at 40 FPS we’re talking about 75ms.
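
Just to show where the 17.5ms and 50.5ms totals in the table come from, here’s a tiny back-of-the-envelope sketch; treating each “average” delay as half the corresponding interval is my reading of the numbers above, and the class/method names are made up.

```java
// Back-of-the-envelope version of the latency table above; each "average" delay
// is modelled as half of the corresponding polling/frame interval.
public class InputLatency {

    static double halfIntervalMs(double hz) {
        return 1000.0 / hz / 2.0;   // average wait in ms for something sampled at 'hz'
    }

    static double fullIntervalMs(double hz) {
        return 1000.0 / hz;         // full interval in ms (e.g. time to render one frame)
    }

    public static void main(String[] args) {
        // "Gamer": 1000Hz mouse, 100 FPS, 2ms monitor
        double gamer = halfIntervalMs(1000)   // USB polling            ~0.5ms
                     + halfIntervalMs(100)    // loop notices the input ~5ms
                     + fullIntervalMs(100)    // GPU renders the frame   10ms
                     + 2;                     // monitor update           2ms

        // "Casual": 100Hz mouse, 40 FPS, 8ms monitor
        double casual = halfIntervalMs(100)   // USB polling             ~5ms
                      + halfIntervalMs(40)    // loop notices the input  ~12.5ms
                      + fullIntervalMs(40)    // GPU renders the frame    25ms
                      + 8;                    // monitor update            8ms

        System.out.printf("Gamer:  %.1f ms%n", gamer);   // 17.5 ms
        System.out.printf("Casual: %.1f ms%n", casual);  // 50.5 ms
    }
}
```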

Carmack is doing a lot of 120Hz work because of the Oculus.

He said that even just using Windows at 120Hz with a 120Hz monitor, the cursor is much smoother, let alone games.
Anyone who says that 60 is even close to the limit we can see has never really researched this topic and is just repeating hearsay, a.k.a. being an idiot.

[quote]240~ - They claim this is the absolute top; anything beyond this would just be silly. But the difference between 120 and 240 is almost unnoticeable, even to trained professionals. Anyone claiming they can tell the difference between 240 and 480 is just getting a placebo effect. (Quite frankly, there’s a good chance that’s true between 120 and 240 too.)
[/quote]
We’ll see. It’s very easy to test.
Show me something at 30fps and something at 60fps for 2 seconds, one time, and I will always be able to tell them apart. It’s night and day. Given that, I’m sure 60 vs 120 will still be quite noticeable, and I wouldn’t be surprised if there is more room above that. But we’ll see.

[quote]We’ll see. It’s very easy to test.
Show me something at 30fps and something at 60fps for 2 seconds, one time, and I will always be able to tell them apart. It’s night and day. Given that, I’m sure 60 vs 120 will still be quite noticeable, and I wouldn’t be surprised if there is more room above that. But we’ll see.
[/quote]
I have a $3000 TV that does 240Hz. Although it’s that MotionFlow 240Hz stuff, because there’s no way to even transmit real 240FPS to a device over HDMI at 1080p; I don’t even think dual-link DVI can do it. I feel like I can tell the difference between 120Hz and 240Hz, but it’s negligible. Honestly, I feel it’s just the placebo effect.

MotionFlow 240Hz only exists because it allows 120Hz 3D playback, and you can see the difference there… even though 3D sent over HDMI at 1080p is only 24Hz/24FPS. DVI, on the other hand, can manage a nice 60FPS 3D (it takes a 120Hz TV to do it, though). But really, using MotionFlow for 240Hz is kind of silly. It’s just a byproduct of having MotionFlow 3D at 120Hz.

I don’t know about other games, but take League of Legends for example. I play at only 40-70 fps and it really seems choppy. My bro’s PC runs at 120-150 fps and it seems so much smoother. It’s not all about smoothness in LoL, either: more fps means more responsive controls. You can really feel the difference between 60 and 120, no matter what eye scientists tell you.

The thing about 60 vs 120 is… it’s a law of diminishing returns. 60 is as good as it needs to get for nearly everyone. Although a lot of people will see improvements at 120Hz, they’ll already be happy at 60Hz. The main problem with rendering at 120Hz is that you literally need twice the processing power and fancy hardware, so you’re already limiting your audience drastically.

Cas :)

That escalated quickly. I’ve never seen a person so pissed off about a framerate.

Well, you’ve got to give it to him that he said please. Always a gentleman.