setSwapInterval(1) high CPU load

In my JOGL application I use FPSAnimator at 60 fps. I was developing on Linux and asked a friend to test the application. He is using a Mac and said that the CPU load is at 30…40% when running it. But on my Linux machine the CPU load is somewhere between 0…1%, so I tested the same application on the same machine running Windows Vista - with the same result as my friend on his Mac: unusually high CPU load.
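
For reference, the setup looks roughly like this (a sketch only, assuming the JOGL 1.x packages; SwapIntervalDemo is an illustrative listener, sketched a bit further down):

    import java.awt.Frame;
    import javax.media.opengl.GLCanvas;
    import com.sun.opengl.util.FPSAnimator;

    // Sketch: a GLCanvas driven at 60 fps by FPSAnimator.
    public class AnimatorSetup
    {
        public static void main(String[] args)
        {
            GLCanvas canvas = new GLCanvas();
            canvas.addGLEventListener(new SwapIntervalDemo()); // listener sketched below
            Frame frame = new Frame("FPSAnimator demo");
            frame.add(canvas);
            frame.setSize(640, 480);
            frame.setVisible(true);
            new FPSAnimator(canvas, 60).start(); // schedules display() ~60 times per second
        }
    }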

setSwapInterval(0);

is what I added, and suddenly the CPU load was very low on Windows too. In both cases the FPS is always around 60, but the CPU load is very different.
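
In case it helps anyone: a minimal sketch of where that call goes, assuming the JOGL 1.x javax.media.opengl API (SwapIntervalDemo is just an illustrative name):

    import javax.media.opengl.GL;
    import javax.media.opengl.GLAutoDrawable;
    import javax.media.opengl.GLEventListener;

    // Minimal listener; setSwapInterval is called once the context is ready.
    public class SwapIntervalDemo implements GLEventListener
    {
        public void init(GLAutoDrawable drawable)
        {
            GL gl = drawable.getGL();
            gl.setSwapInterval(0); // 0 = swap immediately, 1 = sync to the monitor refresh
        }

        public void display(GLAutoDrawable drawable)
        {
            GL gl = drawable.getGL();
            gl.glClear(GL.GL_COLOR_BUFFER_BIT); // per-frame rendering goes here
        }

        public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) {}

        public void displayChanged(GLAutoDrawable drawable, boolean modeChanged, boolean deviceChanged) {}
    }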

I can render 1000 FPS with lower CPU load than with vsync at 60 FPS. Kinda strange?

I am also having this problem under Vista :’(. I am porting a program which was previously written in C# (using the Tao framework and/or OpenTK), both of which implement vsync without this problem…

There must be a fundamentally better way of implementing vsync than this. Presumably the thread just ends up spinning somewhere, eating up CPU cycles, until it is time to swap the buffers.

Is anyone able to shed any light on this problem?

Note: I also observe that this problem does not occur under Linux (Ubuntu 64-bit). The same program which saturates my CPU under Vista uses only ~5% CPU with setSwapInterval(1).

I am using an nVidia card (using the official drivers on both operating systems).

puzzling …

For sure we don’t use a CPU busy loop in our code, so it must be the other OS then :) hmm …

How do other native demos behave with swapInterval 0 and 1?

What exactly do you want me to test?

I’ve profiled a simple JOGL example which just draws a blank screen on a GLCanvas to see where the CPU is actually being saturated. According to JProfiler, the method java.awt.EventDispatchThread.run is invoked once (as expected) and saturates the CPU. This thread calls GameGLEventListener.display for each frame; however (and this is the interesting bit), the total time spent in the GameGLEventListener.display method is only 5% of the total CPU time consumed by EventDispatchThread.run.

…So this points to something in EventDispatchThread.run, but not inside GameGLEventListener.display, eating the CPU!? According to the profiler, EventDispatchThread.run doesn’t call any other methods (at least none which take up more than 0.1% of CPU time). Very strange.

I’ve tested exactly the same code on another Vista machine, and the CPU utilisation is fine!

I can only guess that it must be a dodgy implementation of the native SwapBuffers method on this particular machine. I wonder if it is the graphics driver (unlikely, given I’m using the latest nVidia ones) or the OS (more likely - my version of Vista is from MSDNAA and has always had a few problems).

If you search Google for “SwapBuffers busy wait”, it seems that other people have a similar problem too.

Still, this doesn’t explain why the Tao/OpenTK implementations don’t suffer from this problem. Maybe they use a different method to achieve vsync, or have explicit sleeps in their code.

Here’s what I do, but it may not be applicable to you because I manage my own OpenGL rendering thread and do not use the EDT. That is, my OpenGL rendering is separate from the EDT, so my UI is not blocked by OpenGL calls.


I typically issue a Thread.sleep() before it is time to swap buffers, in order to spend as little time as possible inside the swapBuffers() call itself.

You will need to measure how much time your frame takes and calculate the amount to sleep accordingly. For 60 fps, each frame is 16.667 ms long. If your frame takes 5 ms to render, then you can sleep for about 10 ms and wake up just in time for the swap. You need to give the system at least 1-2 ms of headroom due to timer inaccuracies, so the amount to sleep needs to be clamped (a sketch of this loop follows the workaround below). On Windows, the timer can be made to go at 1 kHz using this piece of code at the start of your app:


    /*
     * Workaround to enable hi-res timer on Windows.  See 6435126.
     * 
     * ForceTimeHighResolution switch doesn't operate as intended
     * http://bugs.sun.com/view_bug.do?bug_id=6435126
     * 
     * Calling Thread.sleep with small argument affects system clock on windows
     * http://bugs.sun.com/view_bug.do?bug_id=4500388
     */
    private static class LongSleepingThread extends Thread 
    {
        public LongSleepingThread() 
        {
            super("HiRes Bugfix (Windows)");
            setDaemon(true);
        }
        
        public void run() 
        {
            while (true) 
            {
                try { Thread.sleep(Integer.MAX_VALUE); }
                catch (InterruptedException e) {}
            }
        }
    }
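
(The thread above just needs to be constructed and started once at startup: new LongSleepingThread().start(). Parking a daemon thread in a long Thread.sleep() is what forces the high-resolution timer on Windows; see the bug reports linked in the comment.)

To make the timing concrete, here is a sketch of the sleep-before-swap loop described above. It is illustrative only: renderFrame() and swapBuffers() stand in for your real rendering and swap code, and the 2 ms margin is just a guess at the timer jitter:

    /*
     * Sketch of the sleep-before-swap pacing loop.  renderFrame() and
     * swapBuffers() are placeholders for the actual rendering and swap code.
     */
    abstract class PacedRenderLoop
    {
        private static final long FRAME_NANOS  = 1000000000L / 60; // ~16.667ms per frame at 60fps
        private static final long MARGIN_NANOS = 2000000L;         // ~2ms slack for timer inaccuracy

        private volatile boolean running = true;

        abstract void renderFrame();  // issue the GL calls for one frame
        abstract void swapBuffers();  // the vsync'd swap; should now block only briefly

        public void run() throws InterruptedException
        {
            while (running)
            {
                long start = System.nanoTime();
                renderFrame();
                long elapsed = System.nanoTime() - start;
                // Sleep for the rest of the frame minus the safety margin,
                // clamped at zero so a slow frame never gives a negative sleep.
                long sleep = Math.max(0L, FRAME_NANOS - elapsed - MARGIN_NANOS);
                Thread.sleep(sleep / 1000000L, (int) (sleep % 1000000L));
                swapBuffers();
            }
        }

        public void stop() { running = false; }
    }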

.rex