Hi all
I’m trying to get a texture image from the GPU back into a BufferedImage, but I can’t get it right. This is the code:
// assuming JOGL (javax.media.opengl.GL) plus java.nio.IntBuffer and the usual AWT/Swing imports
BufferedImage bkgImage = new BufferedImage(256, 256, BufferedImage.TYPE_INT_ARGB);
// load image data...
DataBufferInt buffer = (DataBufferInt) bkgImage.getRaster().getDataBuffer();
gl.glShadeModel(GL.GL_FLAT);
gl.glPixelStorei(GL.GL_UNPACK_ALIGNMENT, 1);
gl.glBindTexture(GL.GL_TEXTURE_2D, backgroundTexturePointerID);
gl.glTexEnvf(GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_DECAL);
gl.glTexParameterf(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_WRAP_S, GL.GL_CLAMP_TO_EDGE);
gl.glTexParameterf(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_WRAP_T, GL.GL_CLAMP_TO_EDGE);
gl.glTexParameterf(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
gl.glTexParameterf(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
// upload the image's packed ARGB ints as BGRA / UNSIGNED_INT_8_8_8_8_REV
gl.glTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGBA8, bkgImage.getWidth(), bkgImage.getHeight(), 0,
        GL.GL_BGRA, GL.GL_UNSIGNED_INT_8_8_8_8_REV, IntBuffer.wrap(buffer.getData()));
gl.glBindTexture(GL.GL_TEXTURE_2D, backgroundTexturePointerID); // redundant re-bind; the texture is still bound
bkgImage.flush();
bkgImage = null;
// read the texture back with the same format/type it was uploaded with
int[] pixels = new int[256 * 256];
gl.glGetTexImage(GL.GL_TEXTURE_2D, 0, GL.GL_BGRA, GL.GL_UNSIGNED_INT_8_8_8_8_REV, IntBuffer.wrap(pixels));
BufferedImage image = new BufferedImage(256, 256, BufferedImage.TYPE_INT_ARGB);
image.getRaster().setSamples(0, 0, 256, 256, 0, pixels); // writes the ints into band 0
// show the result
JLabel l = new JLabel(new ImageIcon(image));
JFrame f = new JFrame();
f.setLayout(null);
l.setBounds(0, 0, 256, 256);
l.setBorder(BorderFactory.createLineBorder(Color.yellow));
f.add(l);
f.setSize(300, 300);
f.setVisible(true);
The above code should display the same image that was loaded in the first place and used for the texture. Instead, I get all kinds of strange results, depending on the pixel format and type I pass to glGetTexImage.
a) Should the pixel format and type passed to glGetTexImage be the same as the ones used when the texture was specified with glTexImage2D, or is there an automatic conversion between different formats?
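For concreteness, this is the kind of symmetric round trip I assume should work, using setRGB instead of setSamples in case that call is part of my problem (an untested sketch):

int[] pixels = new int[256 * 256];
// same format/type as the upload
gl.glGetTexImage(GL.GL_TEXTURE_2D, 0, GL.GL_BGRA, GL.GL_UNSIGNED_INT_8_8_8_8_REV, IntBuffer.wrap(pixels));
BufferedImage image = new BufferedImage(256, 256, BufferedImage.TYPE_INT_ARGB);
// BGRA + UNSIGNED_INT_8_8_8_8_REV packs each pixel as one ARGB int, which is
// what setRGB expects; setSamples(..., 0, pixels) would write only band 0
image.setRGB(0, 0, 256, 256, pixels, 0, 256);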
b) What happens when other pixel formats are used? For example, if the texture's pixel format is GL_BGR and the type is GL_UNSIGNED_BYTE, how can the resulting byte[] of pixels be put into a BufferedImage? setSamples can't take byte arrays, so does an int array have to be built from the byte array?
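To make (b) concrete, this is roughly the repacking I imagine would be needed for a GL_BGR / GL_UNSIGNED_BYTE readback (again just a sketch; note that at width 256 the default GL_PACK_ALIGNMENT of 4 happens to leave no row padding):

byte[] bgr = new byte[256 * 256 * 3];
gl.glGetTexImage(GL.GL_TEXTURE_2D, 0, GL.GL_BGR, GL.GL_UNSIGNED_BYTE, ByteBuffer.wrap(bgr));
int[] rgb = new int[256 * 256];
for (int i = 0; i < rgb.length; i++) {
    int b = bgr[3 * i] & 0xFF;      // bytes arrive in B, G, R order
    int g = bgr[3 * i + 1] & 0xFF;
    int r = bgr[3 * i + 2] & 0xFF;
    rgb[i] = (r << 16) | (g << 8) | b;  // pack as 0x00RRGGBB
}
BufferedImage image = new BufferedImage(256, 256, BufferedImage.TYPE_INT_RGB);
image.setRGB(0, 0, 256, 256, rgb, 0, 256);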
c) Is it possible to create a texture from other existing textures' data, without reading all the texture images back to the CPU, combining them there, and uploading the result as a new texture?
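What I had in mind for (c) is something like drawing the existing textures into the framebuffer and then copying the pixels straight into a new texture with glCopyTexImage2D, so the data never leaves the GPU (a sketch; newTexID and the drawing step are placeholders):

// ... render textured quads that combine the source textures into the framebuffer ...
gl.glBindTexture(GL.GL_TEXTURE_2D, newTexID); // hypothetical id from glGenTextures
// allocate the texture and fill it from the framebuffer region (0,0)-(256,256)
gl.glCopyTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGBA8, 0, 0, 256, 256, 0);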
(Sorry for the many questions.)
TIA
N