32-bit Graphics, getting dithered down to 16-bit?

Hey guys, I recently made the switch to LWJGL from Java2D and I have to say I love it :slight_smile:

Anyway, I’m having a slight problem: my 32-bit graphics are getting dithered down to 16-bit and then displayed on my 32-bit screen. Argh! Any ideas?

Example:

http://kaioa.com/k/vagecmp.png

I’ll try to post any relevant code blocks as well:

	public static QTexture fromFile( String name, boolean flip )
	{
		QTexture texture = null;
		ByteBuffer imageData = null;
		int ilImageHandle;
		int oglImageHandle;
		IntBuffer scratch = BufferUtils.createIntBuffer(1);
		
		try{
			IL.create();
			ILU.create();
			ILUT.create();
		} catch (LWJGLException ex) {
			// DevIL failed to initialize; bail out rather than crash later
			ex.printStackTrace();
			return null;
		}
		// create image in DevIL and bind it
		IL.ilGenImages(scratch);
		IL.ilBindImage(scratch.get(0));
		ilImageHandle = scratch.get(0);
		
		// load the image (clean up the generated IL image if loading fails)
		if( !IL.ilLoadImage( name ) )
		{
			IL.ilDeleteImages(scratch);
			return null;
		}
		
		// convert image to RGBA
		IL.ilConvertImage(IL.IL_RGBA, IL.IL_BYTE);
		
		// flip
		if( flip )
			ILU.iluFlipImage();
		
		// get image attributes
		int width = IL.ilGetInteger(IL.IL_IMAGE_WIDTH);
		int height = IL.ilGetInteger(IL.IL_IMAGE_HEIGHT);
		int textureWidth = QUtil.getNextPowerOfTwo(width);
		int textureHeight = QUtil.getNextPowerOfTwo(height);
		
		// pad the image out to power-of-two dimensions if necessary
		if (textureWidth != width || textureHeight != height) 
		{
			imageData = BufferUtils.createByteBuffer(textureWidth * textureHeight * 4);
			IL.ilCopyPixels(0, 0, 0, textureWidth, textureHeight, 1, IL.IL_RGBA, IL.IL_BYTE, imageData);
		} 
		else 
			imageData = IL.ilGetData();
		
		// create OpenGL counterpart
		GL11.glGenTextures(scratch);
		GL11.glBindTexture(GL11.GL_TEXTURE_2D, scratch.get(0));
		oglImageHandle = scratch.get(0);
		
		GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
		GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
		GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, textureWidth, textureHeight, 0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, imageData);
		
		// create the texture wrapper (with tex-coord ratios if the image was padded)
		if (textureWidth != width || textureHeight != height) 
			texture = new QTexture(oglImageHandle, width, height, (width / (float) textureWidth), (height / (float) textureHeight), textureWidth, textureHeight);
		else
			texture = new QTexture(oglImageHandle, width, height);
		
		// delete Image in DevIL
		scratch.put(0, ilImageHandle);
		IL.ilDeleteImages(scratch);
		IL.destroy();
		ILU.destroy();
		ILUT.destroy();
		
		// revert the gl state back to the default so that accidental texture binding doesn't occur
		GL11.glBindTexture(GL11.GL_TEXTURE_2D, 0);
		
		// return OpenGL texture handle
		return texture;
	}

	public static void drawImage( QTexture tex, int x, int y )
	{
		// color, ra, rx and ry are class fields: tint color, rotation angle and rotation center.
		// Skip drawing entirely when the tint is fully transparent.
		if( color[3] == 0 )
			return;
		// store the current model matrix
		GL11.glPushMatrix();
		
		// translate to the right location and prepare to draw
		GL11.glTranslatef(x, y, 0);
		GL11.glColor4f( color[0], color[1], color[2], color[3] );

		if( ra != 0 && ra != 360 )
		{
			float rx1 = rx - x;
			float ry1 = ry - y;
			GL11.glTranslatef(rx1, ry1, 0);
			GL11.glRotatef( ra, 0, 0, 1 );
			GL11.glTranslatef(-rx1, -ry1, 0);
		}
		GL11.glEnable(GL11.GL_TEXTURE_2D);
		GL11.glBindTexture(GL11.GL_TEXTURE_2D, tex.getTextureId() );
		GL11.glBegin(GL11.GL_QUADS);
		{
			GL11.glTexCoord2f( 0, 0 );
			GL11.glVertex2f( 0, 0 );
			
			GL11.glTexCoord2f( tex.getWidthRatio(), 0 );
			GL11.glVertex2f( tex.getWidth(), 0 );
			
			GL11.glTexCoord2f( tex.getWidthRatio(), tex.getHeightRatio() );
			GL11.glVertex2f( tex.getWidth(), tex.getHeight() );
			
			GL11.glTexCoord2f( 0, tex.getHeightRatio() );
			GL11.glVertex2f( 0, tex.getHeight() );
		}
		GL11.glEnd();
		GL11.glBindTexture(GL11.GL_TEXTURE_2D, 0 );
		GL11.glDisable(GL11.GL_TEXTURE_2D);
		// restore the model view matrix to prevent contamination
		GL11.glPopMatrix();
	}

Guess#1

IL.ilConvertImage(IL.IL_RGBA, IL.IL_BYTE); <- Remove that line.

Btw, you can resample a texture with ILU.iluScale(new_width, new_height, IL.ilGetInteger(IL.IL_IMAGE_DEPTH)), and you get the best quality with ILU.iluImageParameter(ILU.ILU_FILTER, ILU.ILU_SCALE_LANCZOS3).
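
In the loader above that would slot in roughly like this (an untested sketch replacing the ilCopyPixels padding branch; variable names as in your code):

	// resample the IL image itself up to the power-of-two size instead of
	// padding it with empty pixels (Lanczos gives the best quality)
	ILU.iluImageParameter(ILU.ILU_FILTER, ILU.ILU_SCALE_LANCZOS3);
	ILU.iluScale(textureWidth, textureHeight, IL.ilGetInteger(IL.IL_IMAGE_DEPTH));
	imageData = IL.ilGetData();

Note that if you resample instead of pad, the whole texture is the image, so the width/height ratios in QTexture should just be 1.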

Guess#2

Check the driver settings.

Hmm, no such luck. I tried both suggestions and reinstalled the nVidia drivers as well.
Well, it’s not a big deal, but if you guys have any other guesses, please throw 'em at me :slight_smile: Thanks anyway!

If I ever find the issue I’ll post it here for reference.

nVidia does have a setting in the display control panel that forces textures down to 16 bit.
And are you sure you’ve got a 32bpp context? Call System.out.println(Display.getDisplayMode()) after creation to be sure.
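
Something like this will show it (just a sketch; the ModeCheck class and the 800x600 mode are placeholders, and getAvailableDisplayModes() only lists fullscreen-capable modes):

	import org.lwjgl.LWJGLException;
	import org.lwjgl.opengl.Display;
	import org.lwjgl.opengl.DisplayMode;
	
	public class ModeCheck
	{
		public static void main( String[] args ) throws LWJGLException
		{
			// explicitly pick a 32bpp mode rather than taking whatever the default resolves to
			for( DisplayMode mode : Display.getAvailableDisplayModes() )
			{
				if( mode.getWidth() == 800 && mode.getHeight() == 600
						&& mode.getBitsPerPixel() == 32 )
				{
					Display.setDisplayMode( mode );
					break;
				}
			}
			Display.create();
			// prints something like "800 x 600 x 32 @60Hz"; the "x 32" is what matters here
			System.out.println( Display.getDisplayMode() );
			Display.destroy();
		}
	}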

Cas :slight_smile:

EDIT:

Wow, I blame nVidia totally =D

I changed some driver settings, namely ‘Image Quality’ :), and it magically works. Thanks for that recommendation!

* oNyx points at guess #2

Told ya :stuck_out_tongue: :wink:

I think calling ilutEnable(ILUT_OPENGL_CONV); may force nVidia cards to use the correct texture format.

See the entry on ILUT_OPENGL_CONV in http://openil.sourceforge.net/tuts/tut_10/
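
With the LWJGL DevIL binding that would presumably be (untested; assuming the binding mirrors the C names):

	// ask DevIL to convert images to a driver-friendly format; call this once
	// after ILUT.create(), before any textures are uploaded through ILUT
	ILUT.ilutEnable(ILUT.ILUT_OPENGL_CONV);

Note the loader earlier in the thread uploads via GL11.glTexImage2D directly rather than through ILUT, so that flag wouldn't touch it. The direct equivalent there would be requesting a sized internal format, GL11.GL_RGBA8 instead of the unsized GL11.GL_RGBA, which should stop the driver quietly picking 16 bits per texel.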

-Sam

Interesting, although I wouldn’t want to disable anything on their hardware, so I’ll leave it as it is now. Thanks though :wink: