I’ve been stuck on this all day. Googling turned up nothing, and neither did the LWJGL wiki.
I have different tiles that should be displayed with different textures. The textures are loaded into a HashMap, and loading itself seems to work fine. However, when I draw the tiles, they all use the same texture: whichever one was loaded last.
Say I load the textures like this. Whichever call comes last is the texture that gets used for everything. The only reason I can tell the tiles apart at all is that they’re also colored, so right now every texture in the game is stone:
texturefile.LoadTexture("textures/water.png", "water");
texturefile.LoadTexture("textures/metal.png", "metal");
texturefile.LoadTexture("textures/stone.png", "stone");
Here’s the draw method for each tile. Is there something I’m missing here, or is the problem in the initialization?
public void paint(int camx, int camy) {
    // Bind this tile's texture, looked up by the lowercase name of its type.
    GL11.glBindTexture(GL11.GL_TEXTURE_2D,
            texturefile.getTexture(type.name().toLowerCase()).getTextureID());

    // Tile color scaled to 60%; this also modulates the textured quad below.
    GL11.glColor3f(R * 0.6f, G * 0.6f, B * 0.6f);

    // Untextured full-size quad in the tile color.
    GL11.glVertex2d(l.getX() + camx - offset, l.getY() + camy - offset);
    GL11.glVertex2d(l.getX() + camx + offset, l.getY() + camy - offset);
    GL11.glVertex2d(l.getX() + camx + offset, l.getY() + camy + offset);
    GL11.glVertex2d(l.getX() + camx - offset, l.getY() + camy + offset);

    // Textured quad, inset by one pixel so the colored quad shows as a border.
    GL11.glTexCoord2f(0, 0);
    GL11.glVertex2d(l.getX() + camx - offset + 1, l.getY() + camy - offset + 1);
    GL11.glTexCoord2f(1, 0);
    GL11.glVertex2d(l.getX() + camx + offset - 1, l.getY() + camy - offset + 1);
    GL11.glTexCoord2f(1, 1);
    GL11.glVertex2d(l.getX() + camx + offset - 1, l.getY() + camy + offset - 1);
    GL11.glTexCoord2f(0, 1);
    GL11.glVertex2d(l.getX() + camx - offset + 1, l.getY() + camy + offset - 1);
}
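One detail that may matter: paint doesn’t call glBegin/glEnd itself. Those happen once around all the tiles in my main render loop, which looks roughly like this (simplified sketch; the Tile type and variable names are illustrative):

GL11.glEnable(GL11.GL_TEXTURE_2D);
GL11.glBegin(GL11.GL_QUADS);
for (Tile tile : tiles) {
    // Each tile binds its own texture, then submits its vertices.
    tile.paint(camx, camy);
}
GL11.glEnd();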
My game’s rendering is fairly simple overall. For completeness, my OpenGL setup is roughly a standard 2D ortho configuration (simplified sketch; WIDTH and HEIGHT are illustrative):
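GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
// Map GL coordinates to window pixels, origin at the top-left.
GL11.glOrtho(0, WIDTH, HEIGHT, 0, 1, -1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glEnable(GL11.GL_TEXTURE_2D);

I think once I understand how texture loading and binding work, things will get a lot easier. Any help would be appreciated. Thanks for reading!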