BufferStrategy.show() for a Canvas not synced?

BufferStrategy.show() doesn’t seem to stall until the vblank when attached to a Canvas object. Is this by design? I don’t see how to resolve my issues with inconsistent frame rates if I don’t have any way of knowing when stuff gets put on the screen.

I’m basing the assumption that it isn’t stalling on the fact that my frame rates still exceed the refresh rate of the monitor for simple drawing. Has anyone here successfully used a BufferStrategy to control their frame rate while not in fullscreen exclusive mode?

Thanks for any info,
eli curtz

You have to be in fullscreen exclusive mode to get vertical sync.

That’s what I was afraid of. Thanks for the information. So unless my frame rate is much greater than (or much less than) my monitor refresh rate I’m going to get frames displaying on screen for uneven intervals…

  • eli

The only thing vsync fixes is tearing.

You can also use it as a timing mechanism.
(i.e. render as fast as you can, and let bufferStrategy.show() block waiting for the next vsync signal)

However, I advise against it - several platforms, and even more graphics cards, do not report the refresh rate correctly.
(For instance, at the moment Win98 reports all DisplayModes as having a refresh rate of DisplayMode.REFRESH_RATE_UNKNOWN, regardless of the actual refresh rate.)

My suggestion is to use a high-res timer, and either frame-rate lock your app or use a variable timescale.
Which of these two approaches to choose depends on the type of game you are writing.
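
A minimal sketch of those two loop styles, assuming a BufferStrategy has already been created, with System.nanoTime() standing in for whatever high-resolution timer is available and placeholder update/render bodies:

[code]
import java.awt.Graphics;
import java.awt.image.BufferStrategy;

public class LoopSketch {

    // Variable timescale: measure the real time elapsed and advance the
    // simulation by exactly that much, so slow frames just take bigger steps.
    static void variableTimescale(BufferStrategy strategy) {
        long last = System.nanoTime();
        while (true) {
            long now = System.nanoTime();
            double dt = (now - last) / 1e9;   // seconds since the last frame
            last = now;
            update(dt);
            render(strategy);
        }
    }

    // Frame-rate lock: fixed step, then sleep off whatever time is left over.
    static void frameRateLock(BufferStrategy strategy) throws InterruptedException {
        final double step = 1.0 / 60.0;       // target 60 updates per second
        while (true) {
            long start = System.nanoTime();
            update(step);
            render(strategy);
            long elapsedMs = (System.nanoTime() - start) / 1000000L;
            long sleepMs = (long) (step * 1000) - elapsedMs;
            if (sleepMs > 0) Thread.sleep(sleepMs);
        }
    }

    static void update(double dt) {
        // advance the game state by dt seconds
    }

    static void render(BufferStrategy strategy) {
        Graphics g = strategy.getDrawGraphics();
        try {
            // draw the frame here
        } finally {
            g.dispose();
        }
        strategy.show();                      // may or may not block on vsync
    }
}
[/code]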

I wanted to use it to guarantee my frame rate was a divisor of the refresh rate. I have a high-resolution timer (we're using QTJava), but it isn't as useful if I can't stay synchronized with the screen refresh. I assume that even if Windows can't report the correct refresh rate, it still delivers the interrupt (or whatever it is) at the correct time for page-flipping to occur.

Unfortunately I can't use a fixed timestep, as we are doing a suite of simulations with widely varying complexity; on my machine they range from about 15 fps to 150 fps without any limiter turned on. That leaves me with a variable frame length which is often approximately the same as my vsync interval, which is bad because it causes visible stuttering for the user. (Think of a clock versus somebody counting seconds: over a long stretch they stay fairly close, but any given second is likely to contain two counts, or zero, by the person.)

  • eli

[quote]The only thing vsync fixes is tearing.

You can also use it as a timing mechanism.
(i.e. render as fast as you can, and let bufferStrategy.show() block waiting for the next vsync signal)

However, I advise against it - several platforms, and even more graphics cards, do not report the refresh rate correctly.
(For instance, at the moment Win98 reports all DisplayModes as having a refresh rate of DisplayMode.REFRESH_RATE_UNKNOWN, regardless of the actual refresh rate.)

My suggestion is to use a high-res timer, and either frame-rate lock your app or use a variable timescale.
Which of these two approaches to choose depends on the type of game you are writing.
[/quote]
Win9x reports UNKNOWN only if you have the refresh rate set to "optimal", meaning Win9x chooses it. But it doesn't matter - vsync will still occur under Win9x, so it's safe.
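
For reference, a small sketch of how you would check what refresh rate Java reports for the current display mode - on a Win9x machine with the driver set to "optimal" this is where REFRESH_RATE_UNKNOWN shows up:

[code]
import java.awt.DisplayMode;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

public class RefreshRateCheck {
    public static void main(String[] args) {
        GraphicsDevice device = GraphicsEnvironment
                .getLocalGraphicsEnvironment()
                .getDefaultScreenDevice();
        DisplayMode mode = device.getDisplayMode();
        int rate = mode.getRefreshRate();
        if (rate == DisplayMode.REFRESH_RATE_UNKNOWN) {
            System.out.println("Refresh rate: unknown");
        } else {
            System.out.println("Refresh rate: " + rate + " Hz");
        }
    }
}
[/code]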

Yeah, I know the vsync signal is still sent, but without knowing the actual refresh rate being used, you can't use it for timing :frowning:

Didn't know that it's caused by the driver's refresh rate setting being on "optimal", though - cheers :wink:

I wonder how native games manage to switch to a specific refresh rate :confused:

I know DirectX with C/C++ is more flexible, but it can be a pain to do simple things.

One workaround for this:

Switch to fullscreen, then do not render anything - simply flip the buffers and time the flips, so you get an approximate refresh rate (see the sketch at the end of this post).

I measure about 149 Hz over 2 seconds; that's accurate enough for me.

Then you can start rendering and start the game. Watch commercial games - there is always a small lag at startup; they do the same thing. ;D
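
A rough sketch of that measurement trick, assuming fullscreen exclusive mode and a flipping BufferStrategy (if show() doesn't actually block on vsync, the number you get is just how fast the machine can flip, not the monitor's refresh rate):

[code]
import java.awt.Color;
import java.awt.Frame;
import java.awt.Graphics;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.image.BufferStrategy;

public class MeasureRefreshRate {
    public static void main(String[] args) {
        GraphicsDevice device = GraphicsEnvironment
                .getLocalGraphicsEnvironment()
                .getDefaultScreenDevice();
        Frame frame = new Frame(device.getDefaultConfiguration());
        frame.setUndecorated(true);
        frame.setIgnoreRepaint(true);
        device.setFullScreenWindow(frame);   // enter fullscreen exclusive mode
        frame.createBufferStrategy(2);       // request a page-flipping strategy
        BufferStrategy strategy = frame.getBufferStrategy();

        // Flip (nearly) empty frames for ~2 seconds and count the flips.
        long start = System.currentTimeMillis();
        int flips = 0;
        while (System.currentTimeMillis() - start < 2000) {
            Graphics g = strategy.getDrawGraphics();
            g.setColor(Color.BLACK);
            g.fillRect(0, 0, frame.getWidth(), frame.getHeight());
            g.dispose();
            strategy.show();
            flips++;
        }
        long elapsed = System.currentTimeMillis() - start;
        double hz = flips * 1000.0 / elapsed;
        System.out.println("Approximate refresh rate: " + hz + " Hz");

        device.setFullScreenWindow(null);    // leave fullscreen exclusive mode
        frame.dispose();
    }
}
[/code]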