I’m porting my Java2D isometric engine to LWJGL, and am considering cutting diamond-shaped tiles out of a bigger texture using glTexCoord2f and glVertex2f. In Java2D I just use drawImage with a rectangle containing the tile, the rest of the rectangle being transparent. That wastes a lot of space, and if I draw diamond-shaped quads instead, I can also take advantage of glColor3f on the vertices to create dynamic lighting.
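To show what I mean by that last part, here’s a rough sketch of per-vertex lighting on one diamond. The brightness values are made up, and tex is the same texture object as in the snippets below:

void drawLitDiamond(float left, float bottom, float right, float top) {
    GL11.glBegin(GL11.GL_QUADS);
    {
        GL11.glColor3f(left, left, left);       // left corner brightness
        GL11.glTexCoord2f(0, 8.0f / tex.getImageHeight());
        GL11.glVertex2f(0, 8);

        GL11.glColor3f(bottom, bottom, bottom); // bottom corner brightness
        GL11.glTexCoord2f(16.0f / tex.getImageWidth(), 16.0f / tex.getImageHeight());
        GL11.glVertex2f(16, 16);

        GL11.glColor3f(right, right, right);    // right corner brightness
        GL11.glTexCoord2f(32.0f / tex.getImageWidth(), 8.0f / tex.getImageHeight());
        GL11.glVertex2f(32, 8);

        GL11.glColor3f(top, top, top);          // top corner brightness
        GL11.glTexCoord2f(16.0f / tex.getImageWidth(), 0.0f);
        GL11.glVertex2f(16, 0);
    }
    GL11.glEnd();
}

With GL_MODULATE (the default texture environment) the texel colors get multiplied by the interpolated vertex color, which is exactly the smooth shading I’m after.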
With my original test image (scaled up):
http://www.javaengines.dk/test/testoriginal.gif
I got the following result:
http://www.javaengines.dk/test/testfail.gif
using this code:
GL11.glBegin(GL11.GL_QUADS);
{
    // Left corner of the diamond
    GL11.glTexCoord2f(0, 8.0f / tex.getImageHeight());
    GL11.glVertex2f(0, 8);
    // Bottom corner
    GL11.glTexCoord2f(16.0f / tex.getImageWidth(), 16.0f / tex.getImageHeight());
    GL11.glVertex2f(16, 16);
    // Right corner
    GL11.glTexCoord2f(32.0f / tex.getImageWidth(), 8.0f / tex.getImageHeight());
    GL11.glVertex2f(32, 8);
    // Top corner
    GL11.glTexCoord2f(16.0f / tex.getImageWidth(), 0.0f);
    GL11.glVertex2f(16, 0);
}
GL11.glEnd();
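For context, all of this assumes a 1:1 pixel-aligned ortho projection and nearest-neighbour filtering, roughly like this (screenWidth/screenHeight stand in for my real display size):

// One GL unit == one screen pixel, y pointing down as in Java2D
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glOrtho(0, screenWidth, screenHeight, 0, -1, 1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();

// Nearest-neighbour sampling so no neighbouring texels bleed in
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);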
Problem: the diamond being cut out of the texture was shifted one pixel to the right.
So I subtracted 1 from all x-values:
GL11.glBegin(GL11.GL_QUADS);
{
    // Left corner, shifted 1 pixel left in both texture and screen space
    GL11.glTexCoord2f((0.0f - 1) / tex.getImageWidth(), 8.0f / tex.getImageHeight());
    GL11.glVertex2f(-1, 8);
    // Bottom corner
    GL11.glTexCoord2f((16.0f - 1) / tex.getImageWidth(), 16.0f / tex.getImageHeight());
    GL11.glVertex2f(15, 16);
    // Right corner
    GL11.glTexCoord2f((32.0f - 1) / tex.getImageWidth(), 8.0f / tex.getImageHeight());
    GL11.glVertex2f(31, 8);
    // Top corner
    GL11.glTexCoord2f((16.0f - 1) / tex.getImageWidth(), 0.0f);
    GL11.glVertex2f(15, 0);
}
GL11.glEnd();
And got the following result, exactly what I wanted:
http://www.javaengines.dk/test/testsuccess.gif
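Folded into a reusable method, with the 1-pixel shift built in, the cut looks like this (a sketch; tileU/tileV are the tile’s top-left pixel position in the atlas, x/y is where the tile goes on screen, and the corners are in the same order as above):

void drawTile(float x, float y, int tileU, int tileV) {
    // Each tile is 32x16 pixels in the atlas
    float w = tex.getImageWidth(), h = tex.getImageHeight();
    GL11.glBegin(GL11.GL_QUADS);
    {
        GL11.glTexCoord2f((tileU - 1) / w, (tileV + 8) / h);  // left
        GL11.glVertex2f(x - 1, y + 8);
        GL11.glTexCoord2f((tileU + 15) / w, (tileV + 16) / h); // bottom
        GL11.glVertex2f(x + 15, y + 16);
        GL11.glTexCoord2f((tileU + 31) / w, (tileV + 8) / h);  // right
        GL11.glVertex2f(x + 31, y + 8);
        GL11.glTexCoord2f((tileU + 15) / w, tileV / h);        // top
        GL11.glVertex2f(x + 15, y);
    }
    GL11.glEnd();
}

With tileU = 0 and tileV = 0 this reproduces the second snippet exactly.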
Now the million-dollar question: can I trust this tile cutting to be pixel-perfect on all cards/systems, or is the exact set of pixels covered by a quad somewhat undefined?
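Whatever the answer is, I can at least verify a particular card by reading the drawn area back and comparing it byte-for-byte with the expected texels, something like this (glReadPixels uses a bottom-left origin, hence the screenHeight from the setup sketch above):

import java.nio.ByteBuffer;
import org.lwjgl.BufferUtils;

ByteBuffer pixels = BufferUtils.createByteBuffer(32 * 16 * 4);
// Grab the 32x16 area the diamond was drawn into (drawn at the top-left
// of the window in my test, hence screenHeight - 16 with a bottom-left origin)
GL11.glReadPixels(0, screenHeight - 16, 32, 16, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, pixels);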

