I have a game that runs very slowly on machines with integrated graphics, even though it's only a 2D game. I'm adding options that turn off some features so the game can run faster. Switching to JOGL isn't worth it for a game that's already supposed to be finished.
How do I tell whether the system has integrated graphics? I want the program to set the default options based upon how fast the computer is. I tried the following code:
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

// Query the default screen device for the amount of accelerated (video) memory.
GraphicsDevice graphicsDevice = GraphicsEnvironment.getLocalGraphicsEnvironment()
        .getDefaultScreenDevice();
System.out.println("Video Memory: " + graphicsDevice.getAvailableAcceleratedMemory());
This prints -1 on my Mac Mini, which has integrated graphics. However, according to the API docs, -1 means "unlimited" available accelerated memory, which clearly isn't the case here.
Will machines with integrated graphics always output -1? Will machines with discrete graphics always output a real value? Or is there something else that I have to do?
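In the meantime, I've been looking at ImageCapabilities as an alternative signal. This is only a sketch of what I mean, not something I've validated across GPUs, and the headless guard is just there so it doesn't throw on a machine without a display:

```java
import java.awt.GraphicsConfiguration;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.ImageCapabilities;

public class AccelCheck {
    public static void main(String[] args) {
        // Guard so this doesn't throw HeadlessException on a display-less machine.
        if (GraphicsEnvironment.isHeadless()) {
            System.out.println("No display available");
            return;
        }
        GraphicsDevice device = GraphicsEnvironment
                .getLocalGraphicsEnvironment()
                .getDefaultScreenDevice();

        // Negative values are documented as "unlimited", but in practice they
        // may just mean the pipeline can't report a number.
        System.out.println("Video Memory: " + device.getAvailableAcceleratedMemory());

        // Ask the default configuration whether images can be accelerated at all.
        GraphicsConfiguration gc = device.getDefaultConfiguration();
        ImageCapabilities caps = gc.getImageCapabilities();
        System.out.println("Images accelerated: " + caps.isAccelerated());
    }
}
```

I don't know whether isAccelerated() distinguishes integrated from discrete chips, though, or whether it only reflects the Java 2D pipeline in use.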
Should I have some kind of test sequence that renders a bunch of stuff quickly, times it, and then decides which options to use? I've seen this in older Sierra games, but I don't know how well it works. It seems like I'd actually have to display stuff on screen to get an accurate reading, and even then the test would most likely run before the JIT warms up, so the timing wouldn't really be representative.
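For reference, here's the rough shape of the test sequence I have in mind. It renders offscreen to a BufferedImage rather than to the screen, so it's only a proxy for real rendering speed, and the frame counts and the 2-second threshold are placeholder numbers I made up. The extra untimed pass is an attempt to work around the JIT warm-up problem:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class RenderBenchmark {
    // Render the given number of frames of simple fill/blit work offscreen
    // and return the elapsed wall-clock time in milliseconds.
    static long timeRender(int frames) {
        BufferedImage target = new BufferedImage(800, 600, BufferedImage.TYPE_INT_RGB);
        BufferedImage sprite = new BufferedImage(64, 64, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = target.createGraphics();
        long start = System.nanoTime();
        for (int f = 0; f < frames; f++) {
            g.setColor(Color.BLACK);
            g.fillRect(0, 0, 800, 600);
            // Blit the sprite to shifting positions to simulate game drawing.
            for (int i = 0; i < 100; i++) {
                g.drawImage(sprite, (f * 7 + i * 31) % 736, (f * 13 + i * 17) % 536, null);
            }
        }
        g.dispose();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        // Untimed warm-up pass so the JIT compiles the hot paths first.
        timeRender(50);
        long ms = timeRender(200);
        System.out.println("200 frames took " + ms + " ms");
        // Placeholder threshold: slower than ~2 s gets the low-detail defaults.
        boolean lowDetail = ms > 2000;
        System.out.println("Use low-detail defaults: " + lowDetail);
    }
}
```

Is an offscreen test like this even meaningful, given that the real bottleneck might be the blit to the screen?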