LWJGL: how to use procedurally generated BufferedImages

Big picture: I’m in the process of getting more acquainted with LWJGL. My hope is that by using it as a means of accessing OpenGL, I can improve the efficiency of my game’s graphics, leaving more CPU available for audio processing.

Immediate question:
Most of my images are procedurally generated. So far, all the tools and tutorials I’ve run across for displaying images assume that a resource originates from a file. Can someone point me to the operation, or a tutorial, for using a Java BufferedImage as the starting point for a Texture? I’m assuming I just overlooked this, or haven’t run across the right tutorial yet.

A thought occurs to me as I write: I recall reading about stippling lines, and that there is also 2D stippling. Is that the route one takes? In other words, should I look at converting my graphic data to a stippling data format?

It’s just a little function you need:


    public static Texture getTexture(BufferedImage bufferedImage, int target, int dstPixelFormat, int minFilter, int magFilter, boolean mipmap) throws IOException
    {
        int srcPixelFormat;

        // create the texture ID for this texture
        int textureID = createTextureID();
        Texture texture = new Texture(target, textureID);

        // bind this texture
        GL11.glBindTexture(target, textureID);

        texture.setWidth(bufferedImage.getWidth());
        texture.setHeight(bufferedImage.getHeight());

        if (bufferedImage.getColorModel().hasAlpha()) {
            srcPixelFormat = GL11.GL_RGBA;
        } else {
            srcPixelFormat = GL11.GL_RGB;
        }

        // convert that image into a byte buffer of texture data
        ByteBuffer textureBuffer = convertImageData(bufferedImage);

        if (target == GL11.GL_TEXTURE_2D)
        {
            GL11.glTexParameteri(target, GL11.GL_TEXTURE_MIN_FILTER, minFilter);
            GL11.glTexParameteri(target, GL11.GL_TEXTURE_MAG_FILTER, magFilter);
        }

        GL11.glTexImage2D(target, 0, dstPixelFormat, get2Fold(bufferedImage.getWidth()), get2Fold(bufferedImage.getHeight()), 0, srcPixelFormat, GL11.GL_UNSIGNED_BYTE, textureBuffer);
        if (mipmap) {
            GL30.glGenerateMipmap(target); // requires OpenGL 3.0+
        }

        return texture;
    }

By default you could use:

    int target = GL11.GL_TEXTURE_2D;
    int dstPixelFormat = GL11.GL_RGBA;
    int minFilter = GL11.GL_LINEAR;
    int magFilter = GL11.GL_LINEAR;
    boolean mipmap = false;

Image to buffer:


    public static ByteBuffer convertImageData(BufferedImage bufferedImage) {
        WritableRaster raster;
        BufferedImage texImage;
        int w = get2Fold(bufferedImage.getWidth());
        int h = get2Fold(bufferedImage.getHeight());

        if (bufferedImage.getColorModel().hasAlpha()) {
            raster = Raster.createInterleavedRaster(DataBuffer.TYPE_BYTE, w, h, 4, null);
            texImage = new BufferedImage(glAlphaColorModel, raster, false, null);
        } else {
            raster = Raster.createInterleavedRaster(DataBuffer.TYPE_BYTE, w, h, 3, null);
            texImage = new BufferedImage(glColorModel, raster, false, null);
        }

        // copy the source image into the produced image
        Graphics g = texImage.getGraphics();
        g.setColor(new Color(0f, 0f, 0f, 0f));
        g.fillRect(0, 0, w, h);
        g.drawImage(bufferedImage, 0, 0, null);
        g.dispose();

        // build a byte buffer from the temporary image
        // that can be used by OpenGL to produce a texture.
        byte[] data = ((DataBufferByte) texImage.getRaster().getDataBuffer()).getData();

        ByteBuffer imageBuffer = ByteBuffer.allocateDirect(data.length);
        imageBuffer.order(ByteOrder.nativeOrder());
        imageBuffer.put(data, 0, data.length);
        imageBuffer.flip();

        return imageBuffer;
    }
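The glAlphaColorModel and glColorModel fields referenced above come from the Slick/LWJGL demo texture loader. If you don’t have that class handy, they can be defined like this (plain java.awt color models, nothing LWJGL-specific; the holder class name here is just for illustration):

```java
import java.awt.color.ColorSpace;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;

public class GLColorModels {
    // RGBA color model: 8 bits per component, with (non-premultiplied) alpha
    public static final ComponentColorModel glAlphaColorModel =
        new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB),
                                new int[] {8, 8, 8, 8},
                                true,   // has alpha
                                false,  // alpha not premultiplied
                                ComponentColorModel.TRANSLUCENT,
                                DataBuffer.TYPE_BYTE);

    // RGB color model: 8 bits per component, no alpha
    public static final ComponentColorModel glColorModel =
        new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB),
                                new int[] {8, 8, 8, 0},
                                false,  // no alpha
                                false,
                                ComponentColorModel.OPAQUE,
                                DataBuffer.TYPE_BYTE);
}
```

These match the interleaved byte rasters created above, so the backing DataBufferByte comes out in the RGB(A) byte order OpenGL expects.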

Power of 2 function:

private static int get2Fold(int fold) {
        int ret = 2;
        while (ret < fold) {
            ret *= 2;
        }
        return ret;
    } 
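For reference, get2Fold rounds a dimension up to the next power of two (with a minimum of 2), since many older GPUs only accept power-of-two texture sizes. The same result can be computed without a loop using Integer.highestOneBit; the nextPow2 name is just illustrative:

```java
public class Pow2 {
    // Round up to the next power of two, minimum 2 (same behavior as get2Fold)
    public static int get2Fold(int fold) {
        int ret = 2;
        while (ret < fold) {
            ret *= 2;
        }
        return ret;
    }

    // Loop-free equivalent: Integer.highestOneBit(n) is the largest power of
    // two <= n, so double it whenever n is not already a power of two
    public static int nextPow2(int n) {
        if (n <= 2) return 2;
        int h = Integer.highestOneBit(n);
        return (h == n) ? n : h << 1;
    }
}
```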

*It’s much faster to skip the BufferedImage and write directly to a ByteBuffer.
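To illustrate that last point: a procedural image can be written straight into a direct ByteBuffer in RGBA byte order and handed to glTexImage2D, with no BufferedImage or extra copy in between. A minimal sketch, with a checkerboard pattern chosen purely for illustration:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ProceduralTexture {
    // Fill a direct ByteBuffer with RGBA bytes for a w x h checkerboard
    public static ByteBuffer makeCheckerboard(int w, int h, int cellSize) {
        ByteBuffer buf = ByteBuffer.allocateDirect(w * h * 4)
                                   .order(ByteOrder.nativeOrder());
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                boolean on = ((x / cellSize) + (y / cellSize)) % 2 == 0;
                byte v = on ? (byte) 255 : (byte) 0;
                buf.put(v).put(v).put(v);  // R, G, B
                buf.put((byte) 255);       // A: fully opaque
            }
        }
        buf.flip(); // prepare for reading by OpenGL
        return buf;
    }
}
```

The buffer would then go to OpenGL the same way as above, e.g. GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, w, h, 0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, buf);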

More info:


But if you’re doing per-pixel operations in OpenGL you are probably better off working with shaders:

@davedes - This is a cool tutorial, with lots of great info in it! I’ve been making use of your tutorials and of lwjgl-basics. Thanks for writing all this and making it available. It just wasn’t aimed at this specific question, since that example loads a PNG instead of making use of an existing BufferedImage.

@RobinB - Awesome! It will take me a bit to make time and digest what you wrote, but I should be able to jump in and see what I can make of it before the weekend.

Got it from the LWJGL demo and adjusted it a bit.
Just copy / paste for usage, and find out later what it does :smiley:

@davedes -

Actually, I am having some troubles trying to run code from lwjgl-basic.

For example, from the tutorial you linked, there is code for class “Texture”. I can see it in my Eclipse load of lwjgl-basic, in the package “mdesl.graphics”.

It doesn’t have a main(), but I can see where other classes in your project call it. From “mdesl.test” there is “RectTest.java”. However, when I try to run RectTest, I get the following error:

Exception in thread "main" java.lang.RuntimeException: couldn't decode texture
	at mdesl.test.RectTest.create(RectTest.java:56)

The try/catch where this is thrown follows:

			fontTex = new Texture(Util.getResource("res/ptsans_00.png"), Texture.NEAREST);
			
			//in Photoshop, we included a small white box at the bottom right of our font sheet
			//we will use this to draw lines and rectangles within the same batch as our text
			rect = new TextureRegion(fontTex, fontTex.getWidth()-2, fontTex.getHeight()-2, 1, 1);
			
			font = new BitmapFont(Util.getResource("res/ptsans.fnt"), fontTex);
	

Perhaps there is a problem with the font? When I tried to run “FontTest.java” from the same package, I get this error:

java.io.IOException: data for lineHeight is corrupt/missing: 

I don’t know much about fonts, but here is what comes up when I do a text display of the font file you supply with the project (limited to first five lines):

info face=PT Sans size=15 bold=0 italic=0 charset="" unicode=1 stretchH=100 smooth=0 aa=1 padding=0,0,0,0 spacing=1,1
common lineHeight=21 base=16 scaleW=256 scaleH=256 pages=1 packed=0
page id=0 file=ptsans_00.png
chars count=338
char id=32 x=55 y=161 width=1 height=1 xoffset=-1 yoffset=15 xadvance=4 page=0 chnl=0

It looks to me like “lineHeight” is there in the font file.

Any thoughts? I can learn a lot from your tutorials regardless, but it is always reassuring to be able to get the code running.

@ RobinB , re:[quote]Got it from the LWJGL demo and adjusted it a bit.
[/quote]
May I ask which LWJGL tutorial? So far, the only thing I’ve found with a reference to what seems to be one of the key functions (glTexImage2D) is the “Helper Libraries” section, last link:
http://www.lwjgl.org/wiki/index.php?title=Loading_PNG_images_with_TWL’s_PNGDecoder

[quote]May I ask which LWJGL tutorial? So far, the only thing I’ve found with a reference to what seems to be one of the key functions (glTexImage2D) is the “Helper Libraries” section, last link:
http://www.lwjgl.org/wiki/index.php?title=Loading_PNG_images_with_TWL’s_PNGDecoder
[/quote]
Actually I have no clue, I’ve been using it for so long now.
It could be from Slick.

Hopefully the knowledge from the tutorial will help you to upload generic byte data. There is no need for a BufferedImage (unless you also plan to display the image with Java2D). It just causes another step of copying pixel data.

Instead, you should upload the data directly to the texture as a ByteBuffer (or, alternatively, an IntBuffer). lwjgl-basics includes a utility for this:
[icode]Texture.upload(int format, ByteBuffer data)[/icode]

Here is a trivial example of using a software shader:

Obviously it would perform many times faster if you were to use a hardware shader. Generally you should only be uploading pixel data per frame as a very last resort.
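To illustrate the idea: a “software shader” is just a per-pixel function run on the CPU each frame, with the resulting bytes re-uploaded to a texture. A rough sketch, where the pulsing-brightness function and the class name are made up for illustration:

```java
import java.nio.ByteBuffer;

public class SoftwareShader {
    // Run a per-pixel function over an RGBA buffer each frame -- the
    // "software shader" idea. Output depends on (x, y, time), analogous
    // to what a fragment shader computes on the GPU.
    public static void shade(ByteBuffer buf, int w, int h, float time) {
        buf.clear();
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                double s = 0.5 + 0.5 * Math.sin(time + x * 0.1 + y * 0.1);
                byte v = (byte) (255 * s);
                buf.put(v).put(v).put(v).put((byte) 255);
            }
        }
        buf.flip();
    }
}
```

Each frame you would re-run shade and re-upload the buffer to an existing texture (for example with GL11.glTexSubImage2D), which is exactly why this is so much slower than a GLSL fragment shader running on the GPU.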

[quote]Perhaps there is a problem with the font? When I tried to run “FontTest.java” from the same package, I get this error:
[/quote]
Thanks for testing. I can’t reproduce the error on my end here. Have you edited the file at all? Or opened it and maybe saved it by accident with a different charset/line endings?

That exception could be thrown, for example, if there is an empty line at the beginning of the file. (The font parser is not as robust as LibGDX or other frameworks.)

I forked “lwjgl-basics” to my account in github, then cloned that to my c: (windows xp OS), then made the clone the source of the Eclipse project (Juno).

afaik, I didn’t open up any files except in the context of eclipse, and certainly didn’t edit any of the resources. All resources bear the same creation and modification date (presumably the clone time).

I looked at the .fnt file in Notepad, and there is no blank line at the start. The text is as I showed earlier.

I’ll put a copy of the files I feel might be dubious on the following URL, if you want to check them for corruption.

http://www.hexara.com/lwjgl_basics_res/ptsans.fnt
http://www.hexara.com/lwjgl_basics_res/ptsans_00.png
http://www.hexara.com/lwjgl_basics_res/ptsans_00_atlas.png

I included the two .png files because they aren’t ‘normal’ images. Maybe they are correct. Here’s what displays for ptsans_00.png (hard to see the white chars over the grey background):

http://www.hexara.com/lwjgl_basics_res/ptsans_00.png

Are these files okay? If it is a matter of decoding the .fnt file correctly…Microsoft OS “properties” just says “Opens with: Unknown application” – no info as to char encoding. Is there a setting in Eclipse I can report back to you on how the file is being read?

Not sure why it is happening, I will have to look into it. In the mean time, you don’t need any fonts to work with procedural Textures (or shaders for that matter) so you can just run the tests that don’t include bitmap fonts. :slight_smile:

@davedes – Am curious what you uncover! I just did a test that confirms the crash is on the font, not the PNGs that come in the try/catch before the font call.

Unfortunately, my computer doesn’t support shaders, so the program won’t run anyway, even if we get the font file read. In fact, it seems all your tutorials and programs in lwjgl-basic require shaders. Dang. I won’t be upgrading my graphics for a few months probably.

Meanwhile, just saw that the “red book” has a later chapter that discusses Images in some detail. That will hopefully give additional background that will help. Did a review of ByteBuffer today, too. I have a lot of background to fill in.

Onward…

The code I posted is a “software shader” – so it should work. It just changes byte data and uploads it to a texture every frame. :slight_smile: It will at least get you introduced to the concept of fragment shaders – i.e. per-pixel operations – even though it isn’t in GLSL.

Also, just curious what computer and GL version are you are running? Lwjgl-basics should be compatible with GL 2.0.

The computer has two Pentium 4s running at 3.20 GHz, and 3 GB of RAM.
The graphics is an ATI Catalyst adapter, Radeon 7000 Series, which reports OpenGL version 6.14.10.5819. I don’t know what that number means.

This all needs replacing, hopefully before the end of the year. The error message I get running “SpriteBatchTest” says the following:

Exception in thread "main" org.lwjgl.LWJGLException: no shader support found; shaders require OpenGL 2.0

I wonder if I even have 1.5!? :stuck_out_tongue:

Easy to find out:


        System.out.println("OpenGL version: " + glGetString(GL_SHADING_LANGUAGE_VERSION));
        System.out.println("GLSL version: " + glGetString(GL_VERSION));
        System.out.println("Graphic Card: " + glGetString(GL_RENDERER));

Place the code anywhere to print the info.

add: "import static org.lwjgl.opengl.GL11.*;" as an import

D’oh… of course my sprite batcher uses a shader. :stuck_out_tongue:

Try this, it uses no shaders at all and should support GL 1.0:

Also, have you tried LibGDX? This would allow you to use a GL 1.0 based engine, and later upgrade to 2.0 without significant changes to your game’s codebase.

P.S. My test will print the GL version, so let us know what you see in the console. :slight_smile:

GLSL version: 1.3.1072 WinXP Release
Graphic Card: RADEON 7000 DDR x86/SSE2

I put RobinB’s code in one of the NeHe Tutorials (#5, spinning 3D pyramid and cube).

I couldn’t find an import to allow this line:

System.out.println("OpenGL version: " + glGetString(GL_SHADING_LANGUAGE_VERSION));

Can’t be late for work, gotta run.
Will be back at it late tonight.
Thanks! :slight_smile:

Jesus… GL 1.3 is 12 years old. :stuck_out_tongue: Have you installed latest drivers? I think you need to upgrade if you plan on writing hardware-accelerated games.

You need to statically import GL20. That line might not work if you’re running 1.3.

That’s the GLSL version. 1.3 = OpenGL 3.0 :slight_smile:

Robin’s code is wrong. Revised code:


System.out.println("OpenGL version: " + glGetString(GL_VERSION));
System.out.println("GLSL version: " + glGetString(GL_SHADING_LANGUAGE_VERSION));
System.out.println("Vendor + Graphic Card: " + glGetString(GL_VENDOR) + " " + glGetString(GL_RENDERER));

Yeah… if he was using Robin’s code it means GL 1.3, a.k.a. no GLSL. :o