Timing of swapBuffers

How can I determine exactly when the GLDrawable.swapBuffers() command actually swaps the buffers?
Is there a way of waiting for the swapBuffers() to complete?

I have a simple program which rotates a viewpoint by 1 degree every time display() is called, so a complete 360-degree revolution should take 360 calls to display(). With a 60 Hz refresh rate I would expect this to take 6 s, assuming the rendering can be done in less than 1/60th of a second. However, the scene rotates much faster, especially when the window is small. This indicates that display() is called more frequently and that swapBuffers() runs at a rate higher than the refresh rate. I’ve tried both the default automatic buffer swapping and disabling it to do a manual swap. PS. I’m using Windows XP.

A better solution would be to divorce the rate of rotation from your frame-rate.
For example, you want to rotate 360 degrees in 6 seconds; that’s a rate of 60 degrees per second. Measure the time difference in seconds between calls to your rendering code, multiply it by the rotation-per-second figure and add that to the rotation angle.
This’ll give you a consistent speed of rotation, no matter what the frame-rate is.
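Something like this (a minimal sketch of code living inside your GLEventListener; the field names are just illustrative):

```java
// Frame-rate independent rotation: advance the angle by elapsed
// time multiplied by a fixed degrees-per-second rate, not by a
// fixed step per frame.
private static final float DEGREES_PER_SECOND = 60.0f; // 360 degrees in 6 s
private long lastFrameNanos = System.nanoTime();
private float angleDegrees = 0.0f;

public void display(GLAutoDrawable drawable) {
    long now = System.nanoTime();
    float elapsedSeconds = (now - lastFrameNanos) / 1e9f;
    lastFrameNanos = now;

    // Wrap at 360 so the angle stays bounded.
    angleDegrees = (angleDegrees + DEGREES_PER_SECOND * elapsedSeconds) % 360.0f;

    // ... draw the scene using angleDegrees ...
}
```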

gl.setSwapInterval(1)

I’ve already tried that with both 1 and 0 and observed no difference.

But I still wouldn’t know the actual frame rate and frame timings. My aim is to use real-time captured data to control the viewpoint.
It is important to know (1) the latency between capture time and display time, (2) any variations in this latency, and (3) the frame rate and whether any frames are dropped.
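Even without driver-level support you can at least measure (2) and (3) yourself by timestamping every frame and comparing successive intervals against the nominal refresh period; an interval spanning roughly two or more periods suggests a dropped frame. A rough sketch (the 1.5-period threshold is an arbitrary heuristic):

```java
// Call this once per display() to flag likely dropped frames
// on a 60 Hz display.
private static final double REFRESH_PERIOD_MS = 1000.0 / 60.0;
private long previousFrameNanos = 0;

private void recordFrameTime() {
    long now = System.nanoTime();
    if (previousFrameNanos != 0) {
        double intervalMs = (now - previousFrameNanos) / 1e6;
        if (intervalMs > 1.5 * REFRESH_PERIOD_MS) {
            System.out.printf("Possible dropped frame: %.2f ms (~%.1f refresh periods)%n",
                              intervalMs, intervalMs / REFRESH_PERIOD_MS);
        }
    }
    previousFrameNanos = now;
}
```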

Then your driver does not implement/expose this functionality in its GL implementation, or ignores it due to speed/quality settings in your driver’s control panel. Or maybe you are doing it at the wrong place in your code (it usually should be in init()).
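For reference, here’s roughly where that call goes (a sketch assuming the JOGL 2 com.jogamp.opengl package names; older releases used javax.media.opengl):

```java
import com.jogamp.opengl.GL;
import com.jogamp.opengl.GLAutoDrawable;
import com.jogamp.opengl.GLEventListener;

public class VSyncExample implements GLEventListener {
    @Override
    public void init(GLAutoDrawable drawable) {
        GL gl = drawable.getGL();
        // Request vsync: at most one buffer swap per display refresh.
        // Drivers are free to ignore this, so verify with timing.
        gl.setSwapInterval(1);
    }

    @Override
    public void display(GLAutoDrawable drawable) {
        // ... render ...
    }

    @Override
    public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) { }

    @Override
    public void dispose(GLAutoDrawable drawable) { }
}
```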

However, even with vsync enabled, the app may still run at a lower or higher frame rate than the display’s refresh rate.

This would mean that someone with a 120 Hz screen would potentially spin the cube in 3 s and someone with a slow computer might spin it in 40 seconds. What are you trying to achieve? If it is a smooth rotation you should look into what ryanm said.

Mike

The spinning world is only a toy example. Yes, I could get a smooth, consistent spin rate using other methods.

I think what I really want to know is precisely when a rendered frame is displayed (swapped into the visible buffer).
At a 60 Hz refresh rate, this could be anywhere between 0 and 16 ms (1/60th of a second) after the render completes and swapBuffers() is called. This uncertainty in latency is a potential concern for applications where achieving a consistent, minimal latency, and knowing how that latency varies, is important.
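One way to narrow that down empirically (a sketch, assuming vsync is actually honored and you’ve disabled auto-swap with setAutoSwapBufferMode(false)): time how long swapBuffers() blocks. With vsync on, the call, or the glFinish() right after it, tends to stall until the vertical retrace, so the timestamp taken afterwards is close to the moment the frame became visible. Some drivers defer the block to the next GL command instead, so treat this as an estimate, not ground truth.

```java
// Inside display(), after rendering, with auto-swap disabled:
long beforeSwap = System.nanoTime();
drawable.swapBuffers();  // with vsync, this often blocks until retrace
gl.glFinish();           // drain the pipeline so the timestamp is meaningful
long afterSwap = System.nanoTime();

// afterSwap approximates when the frame hit the screen (vsync on only).
System.out.printf("swap blocked for %.2f ms%n",
                  (afterSwap - beforeSwap) / 1e6);
```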

Well, that’s why you need vsync.

Cas :)

The problem may have been due to the graphics hardware not supporting vsync.
A new computer with a decent graphics card has solved the problem.