FPS vs monitor refresh rate

OK, I feel dumb asking this, but…
I was following an infamous YouTube tutorial where, alongside the usual 60 updates per second,
the frame rate is very high, something like 400 or more frames per second.
Now, what is the point of such a high FPS if you will only ever see the frames allowed by your monitor's refresh rate?
Thanks!

There are monitors that can show more than 60 FPS; some go up to 144 Hz or higher.

If your rendering and update logic are coupled, it makes sense to have the game run at more than 60 Hz: it reduces the time between updates and thus the input lag (at 60 Hz a new frame starts every ~16.7 ms; at 400 Hz, every 2.5 ms). This matters more for twitchy games than for RTSes or RPGs; for the latter, it's more important to sync the game to the monitor to give steady update intervals.

If your game logic and rendering are decoupled (running in parallel), you might as well run the logic at 200, 300, 400… updates per second and keep the rendering synced to the monitor's refresh rate. A rough sketch of what I mean is below.
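Something like this (a rough, hypothetical Java sketch, not a full implementation; the single AtomicLong "state" and the draw/vsync call are stand-ins for a real game state and renderer):

```java
import java.util.concurrent.atomic.AtomicLong;

public class DecoupledLoop {
    // Latest game state published by the logic thread (here just a counter).
    private final AtomicLong latestState = new AtomicLong();
    private volatile boolean running = true;

    void startLogicThread() {
        new Thread(() -> {
            long state = 0;
            while (running) {
                state++;                    // run game logic as fast as it will go
                latestState.set(state);     // publish the result for the renderer
            }
        }, "logic").start();
    }

    void renderLoop() {
        while (running) {
            long state = latestState.get(); // read the most recent published state
            draw(state);                    // draw it; swapping buffers with vsync
                                            // enabled caps this loop at the
                                            // monitor's refresh rate
        }
    }

    void draw(long state) { /* render the state here */ }
}
```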

You have to keep performance in mind, of course. Having your game update at extremely high rates will use more system resources, which are occasionally in short supply. In addition, if your game uses a fixed update timestep (one without a delta being passed to all update methods) and the update rate changes, so too will the speed of gameplay.
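For reference, the usual way around that is the accumulator-style fixed-timestep loop: logic always advances in fixed 1/60 s steps regardless of how fast frames are rendered, so gameplay speed stays constant. A minimal sketch (my own, not from the tutorial being discussed):

```java
public class FixedTimestepLoop {
    static final double STEP = 1.0 / 60.0;   // fixed logic step in seconds

    public static void main(String[] args) {
        double accumulator = 0.0;
        long previous = System.nanoTime();
        boolean running = true;

        while (running) {
            long now = System.nanoTime();
            accumulator += (now - previous) / 1_000_000_000.0;
            previous = now;

            // Run as many fixed logic steps as the elapsed time allows.
            while (accumulator >= STEP) {
                update();                     // no delta needed: the step is fixed
                accumulator -= STEP;
            }

            render();                         // draw once per frame; vsync (or a
                                              // sleep) caps this at the monitor's
                                              // refresh rate
        }
    }

    static void update() { /* advance game state by exactly STEP seconds */ }
    static void render() { /* draw the current state */ }
}
```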

IMHO, keep it locked to 60 Hz and design around that.

Cas :slight_smile:

How well does that work with a 75 Hz screen?

It causes the very, very, very small number of people with 75 Hz screens, who know they've got a 75 Hz screen and realise that every so often the screen doesn't update, to occasionally make small whinging noises, which you can ignore.

Cas :slight_smile:

I hate playing 60 Hz games on my 144 Hz monitor. It makes me turn the game off. So there's that.

Unless it's a 2D game; then it's much harder to tell.

Having just upgraded my screen, I was surprised to see how many of those there are out there now! Along with lots of other non-60 Hz displays.