Texture co-ords

I’m just learning OpenGL (LWJGL) and I’m having an issue.

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0,width,0,height,1,-1);
glMatrixMode(GL_MODELVIEW);
glEnable(GL_TEXTURE_2D);

glClearColor(0,0,0,1);

(width = 800, height = 600)

and then the drawing code

glBegin(GL_QUADS);
    glTexCoord2f(0,1);
    glVertex2f(100,200);
    glTexCoord2f(0,0);
    glVertex2f(100,100);
    glTexCoord2f(1,0);
    glVertex2f(300,100);
    glTexCoord2f(1,1);
    glVertex2f(300,200);
glEnd();


The texture I am attempting to draw (http://lwjgl.org/webstart/logo.png) is showing up upside down. Is there a particular order I need to draw in? I thought it was just anticlockwise, and that it only mattered in 3D to make the quad face you.

Also, the texture isn’t being drawn fully; bits are missing around one edge (http://img3.imageshack.us/img3/8758/problemcy.jpg). I’ve refilled the image with white just to check I hadn’t accidentally enabled some sort of transparency in OpenGL.

The origin in OpenGL is bottom-left, not top-left.

Change:

glOrtho(0,width,0,height,1,-1);

To:

glOrtho(0,width,height,0,1,-1);

to put the origin top-left.
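If you’d rather keep the bottom-left origin, the equivalent fix is to invert the V texture coordinate at each vertex instead of flipping the projection. A minimal sketch (the `FlipV` class and `flipV` helper are made-up names for illustration, not LWJGL API):

```java
public class FlipV {
    // Hypothetical helper: invert a V texture coordinate so the image
    // reads top-to-bottom despite OpenGL's bottom-left texture origin.
    static float flipV(float v) {
        return 1.0f - v;
    }
}
```

You would then call, e.g., glTexCoord2f(0, flipV(1)) for the vertex at (100, 200), and likewise for the other three corners.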

I was having the same problem in this thread: http://www.java-gaming.org/topics/slick2d-problem-getting-an-image-size-making-a-grid/26115/msg/227664/view.html

[quote]The texture I am attempting to draw (http://lwjgl.org/webstart/logo.png) is showing upside down. Is there a particular order I need to draw in? I thought it was just anticlockwise, and that only mattered in 3d to make it face you.
[/quote]
When you draw the points you should start with the top-left corner and then go clockwise, like this:
1----------2
|          |
|          |
4----------3

[quote]Also, the texture isn’t being drawn fully; bits are missing around one edge (http://img3.imageshack.us/img3/8758/problemcy.jpg). I’ve refilled the image with white just to check I hadn’t accidentally enabled some sort of transparency in OpenGL.
[/quote]
Check out this page on drawing images with slick-util: http://slick.javaunlimited.net/viewtopic.php?p=25911#p25911

Glad I could help,
Longarmx

Edit: Or you could always do what Riven said and make your coordinate system different

There is no need to go clockwise or counter-clockwise in this case.

Please also keep in mind that OpenGL by default only works ‘as expected’ with image data whose dimensions are powers of two. Your image is 500x236 and will therefore cause rendering artifacts.
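For reference, ‘power-of-two dimensions’ means each side rounded up to the next power of two. A small sketch of that rounding (the `Pot` class name is made up for illustration):

```java
public class Pot {
    // Round a texture dimension up to the next power of two.
    static int nextPot(int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }
}
```

By that rule the 500x236 logo would need a 512x256 texture on strict-POT hardware.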

Unless you’ve enabled backface culling, which is not common in a 2D app, the winding doesn’t matter, though it’s always best to keep it consistent.

There are two things to consider here: many 2D-oriented tutorials use a top-left origin where Y increases downward on the screen, while others (including all the 3D ones) use a center origin where Y increases upward. If you ever mix code from tutorials, you’ll have to make sure you adapt to whatever projection the other one was using.

Secondly, due to some obscure historical artifact, OpenGL loads image data upside-down. While most texture loaders compensate for this and do what you expect, some people do have images pre-flipped, so most decent texture loaders include a flag in their API to let you flip the image you’re loading. Check out the Slick2D TextureLoader API for specifics, assuming that’s what you’re using. Just keep in mind it flips the actual image data, so it effectively inverts your texture coords too.
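In case it helps to see what such a flip flag does conceptually: it just reverses the row order of the raw pixel data before upload. A sketch (the `RowFlip` class is a made-up illustration, not Slick’s actual implementation):

```java
public class RowFlip {
    // Reverse the row order of tightly packed pixel data,
    // turning a top-down image into a bottom-up one (or vice versa).
    static byte[] flipRows(byte[] pixels, int width, int height, int bytesPerPixel) {
        int stride = width * bytesPerPixel;
        byte[] out = new byte[pixels.length];
        for (int y = 0; y < height; y++) {
            // Row y of the source becomes row (height - 1 - y) of the output.
            System.arraycopy(pixels, y * stride, out, (height - 1 - y) * stride, stride);
        }
        return out;
    }
}
```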

I hated switching from bottom-left graphs in maths to top-left for Java2D, so I’m intentionally moving it to bottom-left.

I completely forgot about the power-of-two business; that sorted the issue. I also passed the ‘flip’ argument as true to the Slick texture loader, and that’s fixed too.

Thanks for the quick replies guys.

I’m starting to realize that I’m the only one who uses non-power-of-two textures. Just imagine it was actually true that graphics cards could only use POT textures. That would mean that a 1920x1080 render target, a 100% normal texture, would need to be upsized to 2048x2048. What would this imply for deferred rendering? 2.02x increased memory usage. With 4 16-bit float RGBA render targets + a depth+stencil buffer, you’re already pushing (4x2x4 + 4) = 36 bytes per pixel. For a 1920x1080 framebuffer, that’s already 71.2MB. Add 4x MSAA and you’ve got 284.8MB just for the framebuffer. You’re telling me that I have to increase this to 576.0MB just to make those textures POT?
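Those numbers, spelled out (the `mb` helper is just a made-up bytes-to-mebibytes conversion):

```java
public class FramebufferMath {
    static double mb(long bytes) {
        return bytes / (1024.0 * 1024.0);
    }

    public static void main(String[] args) {
        // 4 RGBA16F targets (4 channels x 2 bytes) + 4-byte depth/stencil = 36 bytes per pixel
        int bytesPerPixel = 4 * 2 * 4 + 4;
        long native1080 = 1920L * 1080 * bytesPerPixel * 4; // 4x MSAA
        long pot1080    = 2048L * 2048 * bytesPerPixel * 4; // same, padded up to POT
        System.out.printf("%.1f MB vs %.1f MB%n", mb(native1080), mb(pot1080)); // ~284.8 vs 576.0
    }
}
```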

Source: http://feedback.wildfiregames.com/report/opengl/feature/GL_ARB_texture_non_power_of_two
Tell me that you’ve ever seen a card that doesn’t support it.

“But performance is lower!”

EDIT: Sorry to keep pestering you guys with things you obviously don’t want to hear
2560x1440 --> 4096x2048, times 36 bytes per pixel, times 4x MSAA = 1152MB instead of 506.25MB. Last time I checked, BF3 also has, you know, textured surfaces.

I’m fully aware of non-POT extensions in OpenGL. If you carefully read what I said (and what you quoted) I said ‘by default’ and ‘as expected’.

Did I want to go into details about non-POT to solve this problem that an ‘opengl newbie’ was running into? No, because it would only confuse him. Sometimes it’s better to suggest a solution that is ‘borderline correct’ instead of ‘absolutely true’, as getting it to work at this stage is more important than doing it in the most efficient way possible.

The fact that he uses Slick to load a non-POT texture, which turns into a POT texture with only a sub-area filled with data, doesn’t help either, but that’s not drastically important in this case.

I think your tone is a bit condescending and I encourage you not to be.

Sorry, I realize that that rant was at least unfitting for this particular thread. I just have a severe case of this at times…

FWIW: I like to point out that I was not wrong in my advice, and have clarified why: emphasis on ‘by default’.

Off topic again, sorry.

Again, where is the proof? The only problem I’ve ever had with non-power-of-two textures was when the texture’s size wasn’t a multiple of 4, since the default pack/unpack alignment is 4 bytes. Is that what you mean?
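For anyone hitting that: the default unpack alignment of 4 means GL assumes each pixel row starts on a 4-byte boundary, so rows whose byte length isn’t a multiple of 4 get skewed. A sketch of the padding calculation (class and method names made up for illustration):

```java
public class Alignment {
    // Bytes GL will assume per row once padded to the pack/unpack alignment.
    static int paddedRowBytes(int width, int bytesPerPixel, int alignment) {
        int row = width * bytesPerPixel;
        // Round the row size up to the next multiple of the alignment.
        return (row + alignment - 1) / alignment * alignment;
    }
}
```

A 501-pixel-wide RGB image has 1503 data bytes per row, but with the default alignment of 4 GL expects 1504; calling glPixelStorei(GL_UNPACK_ALIGNMENT, 1) removes the padding.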

What OpenGL supports ‘by default’ -> ‘without using extensions’

The NPOT extension is “invisible” in that It Just Works for functions that previously demanded POT textures. If you read the extension spec it spells that out.

ISTR that Slick’s Texture class does make a few POT assumptions though, and there are platforms that will absolutely hate you for using NPOT textures even if they technically support them. iOS for example.

Hence my original advice. We’re dealing with Slick here, which allocates a POT texture for a non-POT image, so it’s preferable to keep it simple: scale the image to POT before you upload it to the GPU, and it works. After that, the original poster can ditch Slick’s texture loader, write his own and explore non-POT, but if I’d suggested that up front, it still wouldn’t have worked.
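A sketch of that pre-scaling step using plain AWT (class name made up; note Slick’s loader pads rather than scales, so treat this as one possible approach, not what Slick does):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class PotScale {
    static int nextPot(int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }

    // Stretch an image to power-of-two dimensions before uploading it as a
    // texture, so texcoords 0..1 still cover the whole image.
    static BufferedImage toPot(BufferedImage src) {
        int w = nextPot(src.getWidth());
        int h = nextPot(src.getHeight());
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = dst.createGraphics();
        g.drawImage(src, 0, 0, w, h, null); // scale to fill the POT canvas
        g.dispose();
        return dst;
    }
}
```

For the 500x236 logo this produces a 512x256 image, slightly stretched, with no empty border to bleed into.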

Can’t we all just get along?

It’s core in OpenGL 2.0 though. That’s “default OpenGL” in my dictionary at least, or do you guys keep a DirectX 8 codebase for your DirectX games? :wink:

Anyway, I’ve made my point so let’s just drop this and be friends.