On my 60Hz monitor, I can always tell a difference between 60 and 30 fps. 30 is playable but it is choppy.
They may have, depending on what rate the game updates at and how it's implemented. A commonly used game loop is one where you update at even intervals and then render when there's time left over. Let's say you have an unusually high update rate of 100 Hz (hereafter referred to as UPS, updates per second). Let's calculate the average input delay of a high-end gamer with proper gaming gear and a computer that can achieve 100 FPS, and of a low-end "casual" gamer who gets 40 FPS.
The most important difference when it comes to FPS and input delay here is what happens when the FPS drops below the UPS. At 100 UPS the game should update every 10 milliseconds. However, at 40 FPS the rendering part of the game loop takes 25 milliseconds by itself, meaning that although the game still updates at 100 UPS, the updates are not evenly spread out.
100 FPS and 100 UPS
[update][update][update][update][update]
40 FPS and 100 UPS
[update][update]<------render------->[update][update][update]<------render------->
In general, a game update is often an order of magnitude or more faster than rendering a frame, so in practice the 40 FPS computer is reading input 2-3 times within a very short period, which effectively means that only the first read has a reasonable chance of catching any new input. The result is that although the game speed stays constant regardless of FPS, the input reading effectively happens in sync with your framerate, not your update rate: 100 FPS = 0-10ms delay until the input is detected, 40 FPS = 0-25ms.
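To make that concrete, here's a minimal sketch of the loop style described above; the structure and names are illustrative, not from any particular engine:
[code]
// A minimal fixed-timestep game loop: update at a fixed UPS, render
// whenever there's time left over. Names here are illustrative only.
public class GameLoop {
    static final int UPS = 100;                         // updates per second
    static final long STEP_NANOS = 1_000_000_000L / UPS;

    public static void main(String[] args) {
        long previous = System.nanoTime();
        long lag = 0;
        while (true) {
            long now = System.nanoTime();
            lag += now - previous;
            previous = now;

            // Catch up on pending updates. If render() took 25 ms, this
            // inner loop runs 2-3 updates back-to-back, so only the first
            // has a realistic chance of catching new input.
            while (lag >= STEP_NANOS) {
                pollInput();   // input is sampled here, once per update
                update();      // advance the game state by 10 ms
                lag -= STEP_NANOS;
            }
            render();          // at 40 FPS this alone takes 25 ms
        }
    }

    static void pollInput() { /* read mouse/keyboard state */ }
    static void update()    { /* step the simulation */ }
    static void render()    { /* draw the current state */ }
}
[/code]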
                                               Gamer                     Casual
Mouse/keyboard USB polling delay               1000Hz = average 0.5ms    100Hz = average 5ms
Average delay until the game loop reads input  100 FPS = average 5ms     40 FPS = average 12.5ms
Time needed for the GPU to render a frame      100 FPS = 10ms            40 FPS = 25ms
Monitor update time                            2ms (gaming monitor)      8ms (slow monitor)
Total                                          17.5ms                    50.5ms
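To make the arithmetic explicit, here's a tiny sketch reproducing those totals. The only assumption is the one used in the table: a periodic delay contributes half its period on average, and (as argued above) input is effectively read once per frame:
[code]
// Reproduces the totals from the table above.
public class InputLatency {
    static double totalMs(double pollHz, double fps, double monitorMs) {
        double pollAvg  = 1000.0 / pollHz / 2.0; // USB polling: half a period on average
        double loopAvg  = 1000.0 / fps / 2.0;    // until the loop reads input (synced to FPS)
        double renderMs = 1000.0 / fps;          // full frame render time
        return pollAvg + loopAvg + renderMs + monitorMs;
    }

    public static void main(String[] args) {
        System.out.println(totalMs(1000, 100, 2)); // gamer:  17.5
        System.out.println(totalMs(100,  40,  8)); // casual: 50.5
    }
}
[/code]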
In practice, the time needed for the GPU to render a frame is most likely significantly higher due to the GPU lagging behind the CPU to ensure that it always has work to do. This is also amplified by a lower FPS, since the GPU starts limiting the CPU either when it's X frames behind or when it has Y commands queued up. Being 3 frames behind at 100 FPS is only 30ms of extra delay, but at 40 FPS we're talking about 75ms.
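If you want to cap that queue-ahead yourself, one common trick is a fence per frame: don't start frame N until frame N - K has finished on the GPU. A rough sketch, assuming LWJGL 3's OpenGL 3.2 bindings; the class and the choice of K are mine, not anything from the post above:
[code]
import static org.lwjgl.opengl.GL32.*;

// Caps GPU queue-ahead at MAX_FRAMES_IN_FLIGHT frames by inserting a
// fence after each frame's commands and blocking on the oldest one.
// Must be called on a thread with a current OpenGL 3.2+ context.
public class FrameLimiter {
    static final int MAX_FRAMES_IN_FLIGHT = 2;  // the "X frames behind" knob
    private final long[] fences = new long[MAX_FRAMES_IN_FLIGHT];
    private int index = 0;

    /** Call once per frame, after submitting all draw commands. */
    public void endFrame() {
        long oldest = fences[index];
        if (oldest != 0) {
            // Wait until the frame submitted MAX_FRAMES_IN_FLIGHT ago is done.
            glClientWaitSync(oldest, GL_SYNC_FLUSH_COMMANDS_BIT, 1_000_000_000L);
            glDeleteSync(oldest);
        }
        fences[index] = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
        index = (index + 1) % MAX_FRAMES_IN_FLIGHT;
    }
}
[/code]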
Carmack is doing a lot of 120Hz work because of the Oculus.
He said that even just using Windows at 120Hz on a 120Hz monitor, the cursor is much smoother, let alone games.
Anyone who says that 60 is even close to the limit of what we can see has never really researched this topic and is just repeating hearsay, aka being an idiot.
[quote]240~ - They claim this is the absolute top; anything beyond this would just be silly. But the difference between 120 -> 240 is almost unnoticeable, even to trained professionals. Anyone claiming they can tell the difference between 240 and 480 is just getting a placebo effect. (Quite frankly, there's a good chance that's true between 120 and 240 too.)
[/quote]
We'll see. It's very easy to test.
Show me something at 30 FPS and something at 60 FPS for 2 seconds, one time, and I will always be able to tell them apart. It's night and day. With that, I know that 60 vs 120 will still be quite noticeable, and I wouldn't be surprised if there's more room above that. But we'll see.
[quote]We'll see. It's very easy to test.
Show me something at 30 FPS and something at 60 FPS for 2 seconds, one time, and I will always be able to tell them apart. It's night and day. With that, I know that 60 vs 120 will still be quite noticeable, and I wouldn't be surprised if there's more room above that. But we'll see.
[/quote]
I have a $3000 TV that does 240Hz. Although it's that MotionFlow 240Hz stuff, because there's no way to even transmit real 240 FPS to a device over HDMI at 1080p; I don't think even dual-link DVI can do it. I feel like I can tell the difference between 120Hz and 240Hz, but it's negligible. Honestly, I feel it's just a placebo effect.
MotionFlow 240Hz only exists because it allows 120Hz 3D playback, and you can see the difference there… even though 3D sent over HDMI at 1080p is only 24Hz/24FPS. DVI, on the other hand, can manage a nice 60 FPS 3D (though it takes a 120Hz TV to do it). But really, using my MotionFlow for 240Hz is kind of silly. It's just a byproduct of having MotionFlow 3D at 120Hz.
I don't know about other games, but take League of Legends, for example. I play at only 40-70 FPS and it really seems choppy. My bro's PC runs at 120-150 FPS and it seems so much smoother. It's not all about the visuals in LoL, either: the more FPS, the more responsive the controls. You can really feel the difference between 60 and 120, no matter what eye scientists tell you.
The thing about 60 vs 120 is… it’s a law of diminishing returns. 60 is as good as it needs to get for nearly everyone. Although a lot of people will see improvements at 120Hz, they’ll already be happy at 60Hz. The main problem with rendering at 120Hz is that you literally need twice the processing power and fancy hardware, so you’re already limiting your audience drastically.
Cas 
That escalated quickly. I’ve never seen a person so pissed off about a framerate.
Well, you’ve got to give it to him that he said please. Always a gentleman.
In my experience, framerate jitter (particularly sudden spikes and drops) disturbs the eye more than a fairly low but very constant framerate.
It's debatable what exactly "fairly low" and "too much jitter" mean, but instead of aiming at something above 60Hz, I'd try to aim for a rock-solid 60Hz rate.
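A sketch of that idea (my own, engine-agnostic): instead of rendering as fast as possible and jittering, sleep off the slack so every frame lands on the same ~16.67 ms boundary:
[code]
// Paces the loop to a fixed 60 Hz target: render, then sleep off the
// rest of the frame budget so frame times stay constant rather than jittery.
public class FramePacer {
    static final long FRAME_NANOS = 1_000_000_000L / 60;

    public static void main(String[] args) throws InterruptedException {
        long next = System.nanoTime();
        while (true) {
            render();
            next += FRAME_NANOS;
            long slack = next - System.nanoTime(); // negative if the frame overran
            if (slack > 2_000_000L) {
                // Coarse sleep, leaving ~2 ms for the spin below, since
                // Thread.sleep() can easily overshoot by a millisecond.
                Thread.sleep((slack - 2_000_000L) / 1_000_000L);
            }
            while (System.nanoTime() < next) {
                Thread.onSpinWait();  // busy-wait the last stretch for accuracy
            }
        }
    }

    static void render() { /* draw a frame */ }
}
[/code]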
If you shot a 24 FPS theatrical movie using a camera with an exposure of only 1/1000 of a second per frame, the movie would appear choppy.
The reason movies in the cinema run smoothly is that each frame has an exposure of almost 1/24 of a second.
It picks up the change in movement during this time.
That's the reason computer games appear choppy at even higher FPS:
their frames represent still snapshots of the scene, not a scene in motion.
E.g., the faster your scene changes in a game, the higher the FPS needs to be.
A first-person game profits more from a higher FPS than a top-down isometric game.
The old TV (CRT) screens also had some natural smoothing between frames, because the previous frame faded only slowly and blended into the next. It was less sharp this way, but motion appeared smoother and the screen flicker was less noticeable.
Computer screens always had less of this smoothing effect, I think intentionally, and needed higher refresh rates.
[quote]Well, you've got to give it to him that he said please. Always a gentleman.
[/quote]
Ugh, it was like 4 in the morning… I had one of those "SOMEONE'S WRONG ON THE INTERNET" moments.
[quote]If you shot a 24 FPS theatrical movie using a camera with an exposure of only 1/1000 of a second per frame, the movie would appear choppy.
[/quote]
True, and that's also why motion blur in games is so important: it increases the perceived frame rate. We still need at least 60Hz, but 60 FPS with a small amount of motion blur looks a lot smoother. I generally keep it off in shooting games, since it's easier to see details when the game doesn't have motion blur, but I always leave it enabled in "movie" games like Crysis.
Ironically, motion blur is one of those things likely to cost a lot of frames per second in the first place…
Cas 
At 1080p:
0.3 ms when screen is 100% stationary (fast path in shader).
1.7 ms when the screen is 100% in motion.
1.7ms is enough to drop you from 60 FPS to 54 FPS, which is still well worth it in my opinion. My motion blur is also far from the fastest implementation, but it’s still easy to tweak by clamping the maximum motion blur radius and the number of samples read for blurring. It’s a feature that runs almost 100% on the GPU (some negligible CPU resources are needed to calculate motion vectors), and it scales with screen resolution since it’s a postprocessing filter. If all else fails, you can just disable it.
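For the curious, here's a rough sketch of what such a postprocessing filter can look like, with the two knobs mentioned (maximum blur radius and sample count) and an early-out for stationary pixels. This is my own illustrative GLSL embedded in a Java constant, not the implementation described above:
[code]
// Hypothetical per-pixel motion blur filter: blurs along a velocity-buffer
// vector, with a clamped radius, a fixed sample count, and a fast path
// for pixels that aren't moving. Uniform names are illustrative.
public final class MotionBlurShader {
    public static final String FRAGMENT_SOURCE =
        "#version 150\n" +
        "uniform sampler2D uColor;\n" +
        "uniform sampler2D uVelocity;  // per-pixel motion vectors, in texels\n" +
        "uniform float uMaxRadius;     // clamp on blur length, in texels\n" +
        "const int SAMPLES = 8;        // fewer samples = cheaper but coarser\n" +
        "in vec2 vTexCoord;\n" +
        "out vec4 fragColor;\n" +
        "void main() {\n" +
        "    vec2 velocity = texture(uVelocity, vTexCoord).xy;\n" +
        "    float len = length(velocity);\n" +
        "    if (len < 0.5) {          // fast path: this pixel is stationary\n" +
        "        fragColor = texture(uColor, vTexCoord);\n" +
        "        return;\n" +
        "    }\n" +
        "    velocity *= min(1.0, uMaxRadius / len);  // clamp the blur radius\n" +
        "    vec2 texel = 1.0 / vec2(textureSize(uColor, 0));\n" +
        "    vec4 sum = vec4(0.0);\n" +
        "    for (int i = 0; i < SAMPLES; i++) {\n" +
        "        float t = float(i) / float(SAMPLES - 1) - 0.5;\n" +
        "        sum += texture(uColor, vTexCoord + velocity * texel * t);\n" +
        "    }\n" +
        "    fragColor = sum / float(SAMPLES);\n" +
        "}\n";
}
[/code]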
It doesn’t matter.
- Jev
Heh. Maybe on your fancypants GPU, not on my crappy ones
Just turning on distortions on my Intel GPU cuts the framerate in half…
Cas 
On a Galaxy GeForce GT 610. Not that fancy.
As long as your game can run at 60 FPS with no issues on an average computer, there's nothing to worry about. 120 FPS is just silly.
- Jev
I have found that the people who actually care about a perfect 120+ FPS are gamers. And I mean gamers, not your average COD players. This means that the lowest common denominator doesn't care what the frame rate is, as long as the game plays well.
Most people aren't rocking monitors above 120Hz, so why bother? It would be nice to have the game support 60+, but the people with the hardware to get that are even fewer. It's like those few PC gamers (as we know, consoles aren't hitting 120 anytime soon) who have 120Hz monitors and pimped hardware and expect all this fancy stuff, forgetting that they're a small part of the market.
I think it will be a while before I look into 60+ FPS, as I don't have a monitor above 60Hz.
I always think it worthwhile to remember a silly factoid:
[quote]You don’t see the world with your eyes, you see it with your brain.
[/quote]
What this means is that one single feature of the images being shown (in this case framerate) is not enough to grasp how it affects our perception.
Our brain does a lot of post-processing before our consciousness “sees” the image, and much of that post-processing is affected by the contents of the image themselves, which is why we get optical illusions.
This, on the one hand, makes studies on the effects of FPS somewhat suspect, especially if they just fixate on the eye, but it also casts doubt on personal experiences of the "I can feel the difference" kind; much of the time we see what we want to see.
What I'm trying to say is that FPS alone is meaningless without context. The details of the image you are presenting probably have more power over how smooth it is perceived to be than the number of frames itself.
Here are some funny examples of how our brain fails miserably at seeing the world.
[quote]The old TV (CRT) screens also had some natural smoothing between frames, because the previous frame faded only slowly and blended into the next. It was less sharp this way, but motion appeared smoother and the screen flicker was less noticeable.
Computer screens always had less of this smoothing effect, I think intentionally, and needed higher refresh rates.
[/quote]
Are you sure about that? I always thought that CRTs had much faster response times and needed at least 100Hz to avoid flicker.