color depth problems with LWJGL

hi,

i’m working on my first LWJGL-based game and noticed that on my workstation the images are rendered as if they only had 8-bit color depth, while on my laptop everything looks as expected.

i’m referencing the LWJGL libraries in my jnlp like this:

so i should get the newest stable libs, right?

the java console output on my workstation looks like this:


GLApp.initDisplay(): Current display mode is 1280 x 1024 x 32 @75Hz
checking display mode 800x600 @ 75Hz - 32 bits per pixel
checking display mode 640x400 @ 60Hz - 16 bits per pixel
checking display mode 1024x768 @ 85Hz - 32 bits per pixel
checking display mode 640x480 @ 75Hz - 16 bits per pixel
checking display mode 1280x800 @ 85Hz - 16 bits per pixel
checking display mode 1280x1024 @ 60Hz - 32 bits per pixel
checking display mode 1280x800 @ 75Hz - 16 bits per pixel
checking display mode 640x480 @ 85Hz - 16 bits per pixel
checking display mode 720x400 @ 70Hz - 32 bits per pixel
checking display mode 1024x768 @ 75Hz - 32 bits per pixel
checking display mode 1360x768 @ 60Hz - 16 bits per pixel
checking display mode 800x600 @ 85Hz - 32 bits per pixel
checking display mode 1024x768 @ 75Hz - 16 bits per pixel
checking display mode 720x400 @ 70Hz - 16 bits per pixel
checking display mode 640x480 @ 85Hz - 32 bits per pixel
checking display mode 1280x800 @ 75Hz - 32 bits per pixel
checking display mode 1360x768 @ 60Hz - 32 bits per pixel
checking display mode 800x600 @ 85Hz - 16 bits per pixel
checking display mode 800x600 @ 75Hz - 16 bits per pixel
checking display mode 640x400 @ 60Hz - 32 bits per pixel
checking display mode 1280x1024 @ 60Hz - 16 bits per pixel
checking display mode 1280x800 @ 85Hz - 32 bits per pixel
checking display mode 640x480 @ 75Hz - 32 bits per pixel
GLApp.initDisplay(): Setting display mode to 640 x 480 x 32 @75Hz with pixel depth = 32
creating Display/PixelFormat with values 0, 32, 0 ...
GLApp.initDisplay(): Failed to create OpenGL window: org.lwjgl.LWJGLException: Could not find a valid pixel format
trying again with depthBufferBits 24 ...
lookin' good ...

my graphics card is a geforce 6800 and i’ve never had any problems with it before.
another thing is that on my laptop the game runs at ~300 to 500 fps, while on the workstation i only get around 70 fps.
any ideas what the problem could be?

here’s what the output looks like (the logo should have a smooth gradient):

Try rendering a color-gradient triangle (no texture) and see how that gets rendered.
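Something like this should do it (a rough sketch, assuming an LWJGL 2-style GL11 context is already current; the class and method names are just for illustration):

    import org.lwjgl.opengl.GL11;

    public final class GradientTest {
        // draw one untextured triangle with per-vertex colors; if this shows banding,
        // the framebuffer is the problem rather than the texture
        public static void drawGradientTriangle() {
            GL11.glDisable(GL11.GL_TEXTURE_2D);   // make sure no texture masks the vertex colors
            GL11.glBegin(GL11.GL_TRIANGLES);
            GL11.glColor3f(1f, 0f, 0f); GL11.glVertex2f(-0.5f, -0.5f);
            GL11.glColor3f(0f, 1f, 0f); GL11.glVertex2f( 0.5f, -0.5f);
            GL11.glColor3f(0f, 0f, 1f); GL11.glVertex2f( 0.0f,  0.5f);
            GL11.glEnd();
        }
    }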

I’ve seen driver versions that required RGB8 / RGBA8 instead of RGB / RGBA for the internal texture formats.

If the color gradient also looks rough, though, your framebuffer has a low color depth.
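You can also ask GL directly what you actually got (quick sketch; run this once after the Display has been created):

    // query how many bits per channel the framebuffer really has
    int r = GL11.glGetInteger(GL11.GL_RED_BITS);
    int g = GL11.glGetInteger(GL11.GL_GREEN_BITS);
    int b = GL11.glGetInteger(GL11.GL_BLUE_BITS);
    System.out.println("framebuffer bits per channel: R=" + r + " G=" + g + " B=" + b);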

Check your driver settings. Usually there is some kind of quality<->speed slider. If it’s set to speed, you may see banding and other artifacts.

i played around with the quality settings on my office dual-monitor system (where i have the same problem as on my private single-monitor workstation) and now i get a nice gradient on the second monitor, but on the primary monitor the problem is still there. strange…

try switching off texture compression (s3tc) manually for that texture (in your app).
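if you’re not sure whether the driver is compressing it behind your back, you can check the texture after upload (just a sketch using GL11/GL13 queries; pass in whatever texture id your loader gives you):

    // report whether the given texture got compressed and which internal format the driver chose
    static void dumpTextureFormat(int textureId) {
        GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
        int compressed = GL11.glGetTexLevelParameteri(GL11.GL_TEXTURE_2D, 0, GL13.GL_TEXTURE_COMPRESSED);
        int internal   = GL11.glGetTexLevelParameteri(GL11.GL_TEXTURE_2D, 0, GL11.GL_TEXTURE_INTERNAL_FORMAT);
        System.out.println("compressed=" + (compressed == GL11.GL_TRUE)
                + " internalFormat=0x" + Integer.toHexString(internal));
    }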

Looks like you’ve got a 16-bit texture and the driver is doing the conversion for you. Try using a destination pixel format of:


    	dstPixelFormat = GL11.GL_RGBA16;

during texture loading/creation.
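Assuming your loader hands that value straight to glTexImage2D, it ends up as the internal-format argument, roughly like this (the variable names are placeholders for whatever your loader uses):

    // the destination/internal format is the third argument to glTexImage2D;
    // a sized format like GL_RGBA16 (or GL_RGBA8) requests that precision explicitly
    // instead of letting the driver fall back to a 16-bit format on its own
    GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0,
                      dstPixelFormat,                 // e.g. GL11.GL_RGBA16
                      texWidth, texHeight, 0,
                      srcPixelFormat,                 // e.g. GL11.GL_RGBA, matching the image data
                      GL11.GL_UNSIGNED_BYTE, imageBuffer);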

Kev

cool, that solved the problem. thx a lot!