[SOLVED] LWJGL - Draw section of texture

I’m new to LWJGL and wondering how to draw a section of an image (for example, one sprite from a spritesheet) in a 2D game. I’ve been browsing the internet and only found that you can pass fractional values to

glTexCoord2f(0.5f, 0.5f);

Also, do a texture’s dimensions need to be a power of two, and can you draw a 12x14 region of a texture?

Thanks in advance.

You need the width/height of the sheet, the width/height of each image, and which image you want.


// Assuming sheetWidth/sheetHeight and imageWidth/imageHeight are floats
// (sheet and sprite sizes in pixels), and whichImageX/whichImageY are
// the column/row indices of the sprite you want:
float u0 = (whichImageX * imageWidth)  / sheetWidth;
float v0 = (whichImageY * imageHeight) / sheetHeight;
float u1 = u0 + imageWidth  / sheetWidth;
float v1 = v0 + imageHeight / sheetHeight;

glTexCoord2f(u0, v0); // 0 0
glTexCoord2f(u1, v0); // 1 0
glTexCoord2f(u1, v1); // 1 1
glTexCoord2f(u0, v1); // 0 1
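To make the arithmetic concrete, here is a small self-contained sketch of the same calculation (the `SpriteUV` class and its field names are invented for illustration, not LWJGL API):

```java
// Sketch: compute the UV rectangle of one sprite in a sheet.
// The four resulting values are what you would feed to glTexCoord2f
// at each corner of the quad.
public class SpriteUV {
    public final float u0, v0, u1, v1;

    public SpriteUV(float sheetWidth, float sheetHeight,
                    float spriteWidth, float spriteHeight,
                    int indexX, int indexY) {
        u0 = (indexX * spriteWidth)  / sheetWidth;
        v0 = (indexY * spriteHeight) / sheetHeight;
        u1 = u0 + spriteWidth  / sheetWidth;
        v1 = v0 + spriteHeight / sheetHeight;
    }

    public static void main(String[] args) {
        // Sprite at column 2, row 1 in a 256x256 sheet of 32x32 sprites.
        SpriteUV uv = new SpriteUV(256, 256, 32, 32, 2, 1);
        System.out.println(uv.u0 + " " + uv.v0 + " " + uv.u1 + " " + uv.v1);
    }
}
```

For the 256x256 / 32x32 example this gives u0=0.25, v0=0.125, u1=0.375, v1=0.25.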

Some hardware can handle non-power-of-two (NPOT) texture dimensions, some cannot. You could check at runtime, but that would mean keeping two versions of each texture, which is a lot of effort. So frankly, when you can stitch several NPOT images into one POT texture and scale like there’s no tomorrow, what is the point?
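If you do pad images up into a POT texture, the target size is just the next power of two in each dimension. A minimal helper for that (a generic sketch, not from the post above):

```java
public class Pot {
    // Returns the smallest power of two >= n (for n >= 1).
    static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1;
        }
        return p;
    }

    public static void main(String[] args) {
        // A 12x14 sprite would be padded into a 16x16 region.
        System.out.println(nextPowerOfTwo(12) + "x" + nextPowerOfTwo(14));
    }
}
```

So the 12x14 region from the original question fits fine inside a POT texture; only the texture itself needs POT dimensions, not the region you sample.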

NB. I think a lot of hardware that does support it suffers at least a slight performance drop, so again, what is the point?

NB2. Be careful with OpenGL texture y-coordinates: 0 is the top of the texture and 1 is the bottom. Not friendly, but the convention is older than time itself (no, it’s just OpenGL messing with you).

All of this and more should be explained in my Textures tutorial:

Let me know if you have more questions. :slight_smile:

glTexCoord2f is deprecated, isn’t it?

Why don’t you try learning ‘modern’ OpenGL? And why do so many others still keep using the fixed-function pipeline? I think OpenGL 3.x is worth learning.

OpenGL 3.x isn’t supported enough yet.

OpenGL 2.x is currently the best target.

Maybe the best target for compatibility reasons, but for high performance you need OpenGL 3.x or above! In my mind, OpenGL <= 2.x is only good for small scenes or so-called demos that show off a part of something…

You do realise OpenGL 2.x has VBOs, Shaders, FBOs etc. right?

The only notable upgrades in 3.x are VAOs and in/out shaders. (and FBOs added to the core)

Guardian II uses OpenGL 2.x

FBOs are supported in 93% of cards. If the card supports shaders, it most likely supports FBOs. VAOs are supported in at least 84% of cards.

http://feedback.wildfiregames.com/report/opengl/

[quote]but for high performance OpenGL 3.x or above is needed
[/quote]
Not at all true…

jhffvcc already PM’d me and we had a discussion on OpenGL 2.x and 3.x

I think he got my point about OpenGL 2.x still being ‘modern’.

Even more off topic:

Despite OpenGL 2.x supposedly being supported by 93% of cards, of the 7* people who wanted to test Guardian II, 2 only had graphics cards supporting OpenGL 1.4 :-\

(*7 people that actually responded. 2 never got back to me.)

I have 3 laptops I use constantly and they don’t even support 2.x.

My 2.x-compatible desktop died due to the salty air where I live.

The only way I can do shaders nowadays is on my Android phone, which sadly drains a lot of battery and heats the crap out of my phone.

What’s wrong with learning both pipelines?

And instancing and geometry shaders and non-matching FBO attachments and 32-bit float texture filtering support.