LWJGL white color instead of texture?

Hi, I’m testing and learning LWJGL.
I have been working on a Screen class that will handle all the rendering. I’m trying to pass it a sprite, take the texture from the sprite, and render a quad with that texture.

Now I want to render the sprite and then call a simple fill method to draw a colored quad. But if I do that, both the sprite and the quad come out as plain color fills. If I don’t call fill, the sprite renders properly. I don’t know why this is happening.

Here’s the Screen render method:

	public void render(Sprite sprite, double x, double y) {
		sprite.texture.bind();

		double xx = x + xOffs;
		double yy = y + yOffs;

		double tW = sprite.texture.getWidth() / 8.0;
		double tH = sprite.texture.getWidth() / 1.0;

		System.out.println(sprite.h);

		GL11.glBegin(GL11.GL_QUADS);
		{
			GL11.glTexCoord2d(tW * 0, 0);
			GL11.glVertex2d(xx, yy);

			GL11.glTexCoord2d(tW * 0, 1);
			GL11.glVertex2d(xx, yy + sprite.h);

			GL11.glTexCoord2d(tW * 1, 1);
			GL11.glVertex2d(xx + sprite.w, yy + sprite.h);

			GL11.glTexCoord2d(tW * 1, 0);
			GL11.glVertex2d(xx + sprite.w, yy);
		}
		GL11.glEnd();

		sprite.texture.release();
	}

Screen fill:

	public void fill(double x, double y, double ww, double hh, int color) {
		double r = (double) (color & 0xff0000) / (double) 0xff0000;
		double g = (double) (color & 0xff00) / (double) 0xff00;
		double b = (double) (color & 0xff) / (double) 0xff;

		double xx = x + xOffs;
		double yy = y + yOffs;

		GL11.glBegin(GL11.GL_QUADS);
		{
			GL11.glColor3d(r, g, b);
			GL11.glVertex2d(xx, yy);
			GL11.glVertex2d(xx, yy + hh);
			GL11.glVertex2d(xx + ww, yy + hh);
			GL11.glVertex2d(xx + ww, yy);
		}
		GL11.glEnd();
	}

Why are you calling texture.release? You should only release textures when they are no longer needed.
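
I.e. keep the bind() in render(), but move the release() into a cleanup method that runs once, something like this (just a sketch; dispose() is a name I made up):

   // call this once when the sprite is no longer needed (e.g. on shutdown),
   // not every frame from render()
   public void dispose(Sprite sprite) {
      sprite.texture.release(); // deletes the underlying OpenGL texture
   }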

Also, what is with the texture width/height ratio? See this post for a proper implementation.

Some other things: specify the color white (opaque) before rendering your sprite. Slick’s texture.bind() enables texturing (glEnable) and binds the texture (glBindTexture). When you want to render something with texturing disabled (e.g. a white rectangle), first call TextureImpl.bindNone() to clear the texture state (glDisable).

   public void fill(double x, double y, double ww, double hh, int color) {
      double r = (double) (color & 0xff0000) / (double) 0xff0000;
      double g = (double) (color & 0xff00) / (double) 0xff00;
      double b = (double) (color & 0xff) / (double) 0xff;

      double xx = x + xOffs;
      double yy = y + yOffs;

      TextureImpl.bindNone(); // clear the bound texture so the quad is drawn untextured
      GL11.glBegin(GL11.GL_QUADS);
      {
         GL11.glColor3d(r, g, b);
         GL11.glVertex2d(xx, yy);
         GL11.glVertex2d(xx, yy + hh);
         GL11.glVertex2d(xx + ww, yy + hh);
         GL11.glVertex2d(xx + ww, yy);
      }
      GL11.glEnd();
   }
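
And for the sprite itself, render would look roughly like this (sketch only; it’s basically your render() with the color reset added and the per-frame release() removed):

   public void render(Sprite sprite, double x, double y) {
      // reset the color state so the texture isn't tinted by the last fill()
      GL11.glColor3d(1, 1, 1);
      sprite.texture.bind(); // enables GL_TEXTURE_2D and binds the texture

      double xx = x + xOffs;
      double yy = y + yOffs;
      double tW = sprite.texture.getWidth() / 8.0;

      GL11.glBegin(GL11.GL_QUADS);
      {
         GL11.glTexCoord2d(tW * 0, 0);
         GL11.glVertex2d(xx, yy);

         GL11.glTexCoord2d(tW * 0, 1);
         GL11.glVertex2d(xx, yy + sprite.h);

         GL11.glTexCoord2d(tW * 1, 1);
         GL11.glVertex2d(xx + sprite.w, yy + sprite.h);

         GL11.glTexCoord2d(tW * 1, 0);
         GL11.glVertex2d(xx + sprite.w, yy);
      }
      GL11.glEnd();
      // no texture.release() here; release only when the sprite is disposed
   }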

Thanks for the info.

I’m using texture.getWidth() / 8.0 to get the tile size, normalized to the 0 to 1 range.
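
For example, to pick a tile column I’d do roughly this (just a sketch, and it assumes Slick’s getWidth() gives the normalized width of the image inside the texture, with 8 tiles across the sheet):

	// sketch: left/right texture coordinates for tile column tileX
	public double[] tileU(Sprite sprite, int tileX) {
		double tW = sprite.texture.getWidth() / 8.0; // one tile, in 0..1 texture space
		return new double[] { tW * tileX, tW * (tileX + 1) };
	}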

I didn’t read everything, but you should make sure that you draw everything AFTER you initialize your textures. The same thing happened to me today. :smiley:
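
Something like this order, I mean (rough sketch; Screen, Sprite and loadPlayerSprite() are just placeholders for your own classes):

    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create(); // the GL context has to exist first

        Screen screen = new Screen();
        Sprite player = loadPlayerSprite(); // load textures only after the context is up

        while (!Display.isCloseRequested()) {
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
            screen.render(player, 10, 10); // draw only after the textures are initialized
            Display.update();
        }
        Display.destroy();
    }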