Hi everybody,
We are trying to display a bitmap using JOGL.
Currently we are trying to use the drawPixels method for this.
We use the arguments ‘GL_RGB’ and ‘GL_UNSIGNED_BYTE’, and pass a java byte array.
As an example, we are trying to draw a 10x10-pixel bitmap. We initialize a Java byte array of 10x10x3 bytes (3 bytes per pixel: R, G, B) and pass this array to the function, with width and height both set to 10.
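To make the setup concrete, here is a minimal sketch of how we build the pixel buffer; the class and method names are just for illustration, and the JOGL call we use is shown as a comment since it only runs inside a display callback:

```java
public class RedBitmap {
    static final int WIDTH = 10, HEIGHT = 10, CHANNELS = 3; // R, G, B

    // Build a WIDTH x HEIGHT buffer with every pixel set to pure red.
    static byte[] buildRedPixels() {
        byte[] pixels = new byte[WIDTH * HEIGHT * CHANNELS];
        for (int i = 0; i < pixels.length; i += CHANNELS) {
            pixels[i] = (byte) 255; // red channel; green and blue stay 0
        }
        return pixels;
    }

    public static void main(String[] args) {
        byte[] pixels = buildRedPixels();
        System.out.println(pixels.length); // 300 bytes for a 10x10 RGB image
        // Inside the JOGL display callback we then pass the buffer roughly like:
        // gl.glDrawPixels(WIDTH, HEIGHT, GL.GL_RGB, GL.GL_UNSIGNED_BYTE,
        //                 ByteBuffer.wrap(pixels));
    }
}
```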
If all the red bytes in the array are set to 255 and the rest to 0, the first row is drawn red, but each following row cycles through blue, green, red, and so on.
After some tinkering we found that if we add 2 extra padding bytes to the end of each row, the function draws a nice 10x10 red square. Note that this gives every row a size of exactly 32 bytes, a multiple of 4.
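The arithmetic we stumbled on seems to be "round each row up to the next multiple of 4 bytes"; this hypothetical helper just reproduces the numbers we observed:

```java
public class RowStride {
    // Round a row of `width` pixels with `bytesPerPixel` channels
    // up to the next multiple of `alignment` bytes.
    static int paddedRowSize(int width, int bytesPerPixel, int alignment) {
        int rowBytes = width * bytesPerPixel;
        return ((rowBytes + alignment - 1) / alignment) * alignment;
    }

    public static void main(String[] args) {
        // Our 10-pixel RGB rows are 30 bytes, rounded up to 32:
        // exactly the 2 extra bytes per row we had to add by hand.
        System.out.println(paddedRowSize(10, 3, 4)); // prints 32
    }
}
```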
Now I know that computers often align data on 4-byte boundaries, but in this case it's a bit confusing, and we couldn't find it documented anywhere.
Can someone please explain why this is happening, and what would be the best way to get around it?
Manually adding two empty bytes to each row is asking for bugs; it's a bit too much like the pointer arithmetic we try to avoid in Java.
Is there some function to convert our byte array to a properly aligned one, or is this just a bug?
If any more explanation is needed, just ask.
Many thanks in advance,
Theo Odink and Jeroen Cranendonk