[LWJGL] 2D overlay - changing y-axis direction

Topic: implementing a 2D GUI overlay.

When switching to 2D drawing operations, I set up the viewing matrix with “gl.ortho(0, width, 0, height, -1, 1)”. The problem with this is that OpenGL’s coordinate system then puts the origin in the bottom-left corner of the screen, with the y-axis increasing upwards. This goes against “traditional” 2D graphics systems, so I’m trying to change it.
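
For reference, here is a minimal sketch of the kind of overlay pass being described, written against LWJGL’s GL11-style static bindings rather than the “gl.” wrapper used in this thread; the class and method names are made up for illustration.

import static org.lwjgl.opengl.GL11.*;

// Illustrative sketch only: switch to an orthographic projection for a 2D
// overlay pass, then restore the 3D matrices afterwards.
final class Overlay2D {

    // Origin ends up in the bottom-left corner, y-axis increasing upwards.
    static void begin2D(int width, int height) {
        glMatrixMode(GL_PROJECTION);
        glPushMatrix();
        glLoadIdentity();
        glOrtho(0, width, 0, height, -1, 1);
        glMatrixMode(GL_MODELVIEW);
        glPushMatrix();
        glLoadIdentity();
    }

    // Put back whatever matrices the 3D pass was using.
    static void end2D() {
        glMatrixMode(GL_PROJECTION);
        glPopMatrix();
        glMatrixMode(GL_MODELVIEW);
        glPopMatrix();
    }
}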

First I tried setting up the viewing transform as “gl.ortho(0, width, height, 0, -1, 1)”, but front-facing polygons then become back-facing and don’t get drawn. Similar tricks with “gl.scaled(1, -1, 1) ; gl.translated(0, -height, 0)” or “gl.translated(width/2, height/2, 0) ; gl.rotated(180, 0, 1, 0) ; gl.rotated(180, 0, 0, 1) ; gl.translated(-width/2, -height/2, 0)” produce the same effect.

Is there any way to change the direction of the y-axis without changing the facing direction of the polygons?

Figuring there was no way to do this, I looked at ways to fix the problems it causes. You can disable or change culling, change the winding direction to GL.CW, draw your polygons the “wrong way round”, or skip the y-axis switch entirely and just put up with the odd coordinate system OpenGL provides.
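
To illustrate the winding-direction option, here is a rough sketch in the same GL11 style (again, not code from this thread): the flipped ortho mirrors every polygon’s screen-space winding, so telling OpenGL that clockwise windings are front-facing lets existing counter-clockwise geometry survive culling.

import static org.lwjgl.opengl.GL11.*;

// Illustrative sketch only: top-left origin with the y-axis increasing
// downwards, compensating for the mirrored winding via glFrontFace.
final class OverlayYDown {

    static void begin2D(int width, int height) {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, width, height, 0, -1, 1); // top and bottom swapped
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glFrontFace(GL_CW);                  // clockwise is now front-facing
    }

    // Restore the default winding rule before returning to the 3D pass.
    static void end2D() {
        glFrontFace(GL_CCW);
    }
}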

What do you think is the “proper” solution to this problem?

The ‘proper’ solution to this is to think like a mathematician and have 0,0 as the bottom-left coordinate :) After a while it becomes natural.

Failing that, use the gl.ortho trick but call gl.disable(GL.CULL_FACE), which stops the faces being culled.

Cas :)
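
That combination might look something like the following sketch (GL11-style bindings, not Cas’s actual code): flip top and bottom in the ortho call, then disable culling so the reversed winding no longer matters.

import static org.lwjgl.opengl.GL11.*;

// Illustrative sketch only: the flipped-ortho trick plus no face culling.
final class OrthoTrick {

    static void setup2D(int width, int height) {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, width, height, 0, -1, 1); // (0,0) at the top-left, y runs down
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glDisable(GL_CULL_FACE);             // winding direction no longer matters
    }
}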

Yeah, that’s what I thought. I’m trying to get things correct in my head, as there seems to be no “cute” solution to the problem. I checked out the source code for GLUI and it seems they just draw everything backwards… Quitters!

I’m pulling a few tricks to get all the pixels to line up correctly, and they still don’t want to play ball. It’s playing hell with my eyes, peering at my screen trying to work out which pixels aren’t behaving this time…

Grrr… >:(

OpenGL really doesn’t like doing pixel-perfect primitives, does it? For some reason, my OpenGL implementation doesn’t appear to be working quite to spec. Although there is a bit of wiggle room here and there, so maybe it’s being correct, just unhelpfully so.

Too tired right now, I’ll tackle this tomorrow. :-/

No problems here on Nvidia drivers. Pixel perfect every time. Although I do recall some strangeness about adding 0.35f to coordinates for something, but I don’t bother with it.

Cas :)

Yeah, you can call gl.translatef(0.375f, 0.375f, 0.0f) to shift the grid slightly with respect to window coordinates. Points can then be specified in integer coordinates (no rounding problems when they are truncated) and the start and end points of lines fall within the inscribed diamonds. It’s a good trick (theoretically), but until I work out what’s going wrong I’m avoiding it - one less thing to go wrong!
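
A hedged sketch of that trick in the same GL11 style (class and method names invented; whether it actually behaves on a given driver is exactly the open question here):

import static org.lwjgl.opengl.GL11.*;

// Illustrative sketch only: the 0.375 offset trick for pixel-exact 2D drawing.
final class PixelExact {

    static void begin2D(int width, int height) {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, width, 0, height, -1, 1);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        // Nudge the grid so integer coordinates land inside the diamonds used
        // by the point/line rasterisation rules.
        glTranslatef(0.375f, 0.375f, 0.0f);
    }

    // Points and line endpoints can now be given as plain integer pixels.
    static void drawMarker(int x, int y) {
        glBegin(GL_POINTS);
        glVertex2i(x, y);
        glEnd();
    }
}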

This is the problem I’m having:

B
|\
| \
|  \
A   C

Whether I draw a LINE_LOOP of ABC or a LINES of ABBC, the pixel at B is not drawn. The line AB shouldn’t draw it because it doesn’t exit the inscribed diamond. The line BC should draw B (but not C), but it appears not to draw B at all. Still not sure if it’s my dodgy code, my dodgy graphics drivers or my dodgy hardware at the moment. Quite infuriating!

(I’m currently trying to work out whether the implementation is allowed to miss out this pixel when one line is y-major and the other is x-major. Straw grasping ahoy!)
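
For completeness, here is a minimal sketch of the test case described above (GL11-style bindings; the actual coordinates of A, B and C are invented for illustration). Under the diamond-exit rule, both variants should light the pixel at B, since the BC segment exits B’s diamond on its way to C.

import static org.lwjgl.opengl.GL11.*;

// Illustrative sketch only: the two ways of drawing the ABC outline.
final class LinePixelTest {

    // A bottom-left, B directly above A, C to the right of A, as in the
    // diagram (assuming a y-up ortho).
    static final int AX = 10, AY = 10;
    static final int BX = 10, BY = 20;
    static final int CX = 20, CY = 10;

    // Variant 1: closed loop A -> B -> C -> A.
    static void drawLoop() {
        glBegin(GL_LINE_LOOP);
        glVertex2i(AX, AY);
        glVertex2i(BX, BY);
        glVertex2i(CX, CY);
        glEnd();
    }

    // Variant 2: separate segments AB and BC (vertices A, B, B, C).
    static void drawSegments() {
        glBegin(GL_LINES);
        glVertex2i(AX, AY);
        glVertex2i(BX, BY);
        glVertex2i(BX, BY);
        glVertex2i(CX, CY);
        glEnd();
    }
}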