Getting / Setting image pixels

Hi, I am trying to dynamically edit an image.
I have put the pixels into a byte buffer and then made a byte array from that.
How can I set and get a single pixel at (x, y)?

Thanks


	public void GetPixelsFromImage()
	{
		int size = m_iWidth*m_iHeight*16; // 16 bytes per pixel: 4 channels * 4 bytes each with GL_INT
		GL11.glBindTexture(GL11.GL_TEXTURE_2D, m_iTextureID);
		m_bbPixels = ByteBuffer.allocateDirect(size);
		GL11.glGetTexImage(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, GL11.GL_INT, m_bbPixels);
		m_pixels = new byte[size];
		m_bbPixels.get(m_pixels);
		m_bbPixels.rewind(); // get() moved the position to the end; reset it for later reads
	}
	
	public void SetPixels()
	{
		GL11.glBindTexture(GL11.GL_TEXTURE_2D, m_iTextureID);
		GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, m_iWidth, m_iHeight, 0,GL11.GL_RGBA, GL11.GL_INT, m_bbPixels);
	}
	
	public void CopyPixelsToByteBuffer()
	{
		m_bbPixels.clear(); // reset the position before refilling the buffer
		m_bbPixels.put(m_pixels);
		m_bbPixels.flip(); // make the data readable from the start for glTexImage2D
	}
	
	public void SetPixel(int x, int y, int rgb)
	{
		///???????
	}

	public int GetPixel(int x, int y)
	{
		///???????
	}


public void setPixel(int x, int y, int rgba){
    int index = (y * width + x) * 4; //4 is for the 4 components
    //put data at index
}

I’m veeeery suspicious of your use of GL_INT…

edit: I was wrong.

Thanks theagentd :)

So pixels[index + 0] = red (0 - 255?),
pixels[index + 1] = green,
pixels[index + 2] = blue,
pixels[index + 3] = alpha?

What’s wrong with GL_INT? I stole the code from http://lwjgl.org/wiki/index.php?title=Render_to_Texture_with_Frame_Buffer_Objects_(FBO)

GL_INT is signed. I’d use GL_UNSIGNED_BYTE. If you use bytes instead, the index + <0-3> is correct. If you want to, you can wrap your ByteBuffer in an IntBuffer, and just set a whole pixel with a single command.
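To spell the byte version out (a rough, untested sketch; it assumes GL_UNSIGNED_BYTE data, i.e. a buffer of m_iWidth*m_iHeight*4 bytes rather than *16, packed in memory as R, G, B, A, with the int form 0xRRGGBBAA):

public void SetPixel(int x, int y, int rgba){
    int index = (y * m_iWidth + x) * 4; // 4 bytes per pixel
    m_pixels[index]     = (byte)((rgba >>> 24) & 0xff); // red
    m_pixels[index + 1] = (byte)((rgba >>> 16) & 0xff); // green
    m_pixels[index + 2] = (byte)((rgba >>> 8)  & 0xff); // blue
    m_pixels[index + 3] = (byte)( rgba         & 0xff); // alpha
}

public int GetPixel(int x, int y){
    int index = (y * m_iWidth + x) * 4;
    return ((m_pixels[index]     & 0xff) << 24)  // & 0xff undoes Java's sign extension
         | ((m_pixels[index + 1] & 0xff) << 16)
         | ((m_pixels[index + 2] & 0xff) << 8)
         |  (m_pixels[index + 3] & 0xff);        // packs the pixel back as 0xRRGGBBAA
}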

Ok, thanks again.
I tried wrapping the ByteBuffer in an IntBuffer, but when setting the pixels back with glTexImage2D using the IntBuffer instead of the ByteBuffer, it said the IntBuffer wasn't big enough ???

You’re supposed to set all channels of a pixel in one command -> one set is one pixel. You also shouldn’t multiply the index by 4, as an int is already four bytes.
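Roughly like this (an untested sketch using the same assumed field names; it also assumes the ByteBuffer keeps Java's default big-endian order, so each int is laid out in memory as R, G, B, A):

// Once, after allocating m_bbPixels (java.nio.IntBuffer):
IntBuffer m_ibPixels = m_bbPixels.asIntBuffer(); // view sharing the same memory

public void SetPixel(int x, int y, int rgba){
    m_ibPixels.put(y * m_iWidth + x, rgba); // one int per pixel: no *4 on the index
}

public void SetPixels(){
    m_ibPixels.rewind(); // relative put()s move the position; rewind so glTexImage2D sees the whole buffer
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, m_iTextureID);
    GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, m_iWidth, m_iHeight,
            0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, m_ibPixels);
}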

Thanks. Ok, it's almost working, but when I get all the pixel colours and write them onto another image (to test that it's working), the image turns out blue. Other than that it's fine; the shading is correct.
Image:

where it is blue it should be brown (it's a ground texture). Do you have any idea why this could be?


		for (int x = 0; x < width; x++)
		{
			for (int y = 0; y < height; y++)
			{
				int colour = m_texture.GetPixelAt(x, y);
				int r = colour % 256;
				int g = (int)(colour / 256) % 256;
				int b = (int)((colour / 256) / 256) % 256;
				int a = (int)(((colour / 256) / 256)/256) % 256; // doesn't get the right alpha either
				
				GL11.glColor3b((byte)r,(byte)g,(byte)b);
				Graphics.DrawSquare(x,y, 1, 1);
			}
		}

How about a cleaner solution, pulling each channel out with bit shifts? (This assumes the 0xRRGGBBAA packing from above, which also means your modulo version was reading the channels from the wrong end.)


byte r = (byte)((colour >>> 24) & 0xff);
byte g = (byte)((colour >>> 16) & 0xff);
byte b = (byte)((colour >>> 8) & 0xff);
byte a = (byte)(colour & 0xff);

Thanks for the reply ra4king :)
As well as being cleaner, that fixed most of it, although the alpha is 0 at every pixel when it should be 255, and the pixels are too bright. It must have something to do with my offscreen rendering code; I just have no idea what ???

That's because Java byte values only go from -128 to 127, so 255 is stored as -1. I wouldn't rely on glColor3b to fix that up for you either: OpenGL treats those bytes as signed as well, mapping -128..127 to -1.0..1.0, so glColor3ub (which takes unsigned 0-255 bytes) is the safer call here.

EDIT: wait…255 should be -1, how do you get 0?
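To see the signed-byte behaviour in isolation (a tiny standalone demo, nothing LWJGL-specific):

public class SignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte)255;            // stored as -1: Java bytes are signed
        System.out.println(b);         // prints -1
        System.out.println(b & 0xff);  // prints 255: the mask recovers the unsigned value
    }
}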

I'm just guessing. Some pixels are very transparent, and the rest you can't see at all.