LWJGL/OpenGL not drawing anything (2D ortho setup)

I am using OpenGL for 2D rendering and would like to use actual pixel coordinates. By this, I mean that I would like (0,0) to be in the top left of the window, and (width,height) to be in the bottom right of the window (where width and height are the window’s dimensions in pixels). To do this, I use a projection matrix which is generated with glOrtho, and then passed to a vertex shader:

GL11.glViewport(0, 0, width, height);
GL11.glDisable(GL11.GL_DEPTH_TEST);
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glOrtho(0f, width, height, 0f, -1f, 1f);
GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projectionBuffer);
GL11.glLoadIdentity();

I am using LWJGL, which does not have bindings for glm, so I obtain a 2D orthographic matrix using the OpenGL calls above. I then reset the projection matrix so it does not affect my later draw calls. After this, projectionBuffer (a FloatBuffer) contains the projection matrix generated by glOrtho.

The projection matrix produced looks like this (I don’t know if this is helpful):

0.0015625 0.0 0.0 1.0
0.0 -0.0027777778 0.0 -1.0
0.0 0.0 -1.0 0.0
0.0 0.0 0.0 1.0

My vertex shader looks like this:

#version 330 core

layout (location = 0) in vec4 vertex;

out vec2 TexCoords;

uniform mat4 model;
uniform mat4 projection = mat4(1.0);

void main()
{
    TexCoords = vertex.zw;
    gl_Position = projection * model * vec4(vertex.x, vertex.y, 0.0, 1.0);
}

When I initialise the shader, I use glUniformMatrix4 to set the projection matrix uniform. I’m certain this succeeds because calling glGetUniform afterwards returns the same projection matrix.

The model matrix is produced each time a textured quad is drawn. Every quad shares the same vertex and UV data, stored in a single VBO in the same VAO; only the model matrix differs per quad. The model matrix is calculated correctly to produce real screen coordinates. For example, a square with its top left at (0,0) and a width/height of 128 would produce the following model matrix:

128.0 0.0 0.0 0.0
0.0 128.0 0.0 0.0
0.0 0.0 1.0 0.0
0.0 0.0 0.0 1.0

This model matrix is passed to the shader successfully using glUniformMatrix4 and I have checked this.
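For reference, a model matrix like the one above can be built directly in Java without any GL calls. This is a minimal sketch (the class and method names are mine, not from the original code); the array is laid out column-major, which is what glUniformMatrix4 expects by default:

```java
// Minimal sketch: builds a column-major 4x4 model matrix that scales a
// unit quad to (width, height) and translates its top-left corner to (x, y).
public final class ModelMatrix {
    public static float[] forQuad(float x, float y, float width, float height) {
        float[] m = new float[16];
        m[0]  = width;   // X scale
        m[5]  = height;  // Y scale
        m[10] = 1f;
        m[12] = x;       // X translation (column-major: last column)
        m[13] = y;       // Y translation
        m[15] = 1f;
        return m;
    }
}
```

For a quad with its top left at (0,0) and size 128, this yields exactly the matrix shown above: scale 128 on X and Y, zero translation.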

To initialise the shared VAO with the quad’s vertex data, I use the following code:

vao = GL30.glGenVertexArrays();
int vbo = GL15.glGenBuffers();
GL30.glBindVertexArray(vao);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, QUAD_BUFFER, GL15.GL_STATIC_DRAW);

GL20.glEnableVertexAttribArray(0);
GL20.glVertexAttribPointer(0, 4, GL11.GL_FLOAT, false, 16, 0);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
GL30.glBindVertexArray(0);

QUAD_BUFFER refers to a float[] containing the vertex and texture coordinate data.
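QUAD_BUFFER itself isn’t shown; for a unit quad interleaved as (x, y, u, v) per vertex, it would plausibly look like this (my assumption, but consistent with the 4-component attribute, the 16-byte stride, and the 6-vertex draw call below):

```java
// Hypothetical contents of QUAD_BUFFER: two triangles forming a unit quad,
// each vertex packed as (x, y, u, v). The model matrix scales this quad to
// the desired pixel size. 6 vertices * 4 floats = 24 floats, matching the
// stride of 16 bytes (4 floats) passed to glVertexAttribPointer.
public final class QuadData {
    public static final float[] QUAD_BUFFER = {
        // x,  y,   u,  v
        0f, 0f,  0f, 0f,
        1f, 0f,  1f, 0f,
        1f, 1f,  1f, 1f,

        0f, 0f,  0f, 0f,
        1f, 1f,  1f, 1f,
        0f, 1f,  0f, 1f,
    };
}
```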

Finally, to draw a textured quad, I use the following:

shader.setMatrix4f("model", model);//Makes a call to glUniformMatrix4 - model is the model matrix as explained above.
GL13.glActiveTexture(GL13.GL_TEXTURE0);
texture.bind(); 
GL30.glBindVertexArray(vao);
GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, 6);

The problem is, when I run the application, nothing is drawn on the window; it remains completely black. When not using the shader, I can draw shapes using the old immediate-mode method (glBegin etc.). I cannot figure out what I’m doing wrong. I suspect it’s something to do with the projection matrix pushing the vertices off the window.

Your matrix is inconsistent with what a call to glOrtho(0, 1280, 720, 0, -1, 1) would produce. Starting from an identity matrix at the top of the active matrix stack, glOrtho(0, 1280, 720, 0, -1, 1) produces:

 1.563E-3  0.000E+0  0.000E+0 -1.000E+0
 0.000E+0 -2.778E-3  0.000E+0  1.000E+0
 0.000E+0  0.000E+0 -1.000E+0  0.000E+0
 0.000E+0  0.000E+0  0.000E+0  1.000E+0

You see that the translation part has inverted signs.
Your posted matrix essentially results in your coordinate system having its origin in the bottom-right corner.
You probably called other matrix stack methods before your glOrtho call. To make sure glOrtho produces the correct result, call glLoadIdentity first, since glOrtho multiplies its matrix onto the current top matrix of the active matrix stack.
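To double-check what glOrtho should produce, you can also compute the matrix yourself in plain Java, with no matrix stack involved. A minimal sketch following the formula from the glOrtho reference page (column-major layout; the class and method names are mine):

```java
// Computes the same matrix glOrtho(l, r, b, t, n, f) multiplies onto the
// stack, stored column-major as OpenGL expects.
public final class Ortho {
    public static double[] ortho(double l, double r, double b, double t,
                                 double n, double f) {
        double[] m = new double[16];
        m[0]  = 2.0 / (r - l);          // X scale
        m[5]  = 2.0 / (t - b);          // Y scale (negative when t < b)
        m[10] = -2.0 / (f - n);         // Z scale
        m[12] = -(r + l) / (r - l);     // tx
        m[13] = -(t + b) / (t - b);     // ty
        m[14] = -(f + n) / (f - n);     // tz
        m[15] = 1.0;
        return m;
    }
}
```

For ortho(0, 1280, 720, 0, -1, 1) this gives tx = -1 and ty = +1, matching the matrix above rather than the one posted in the question.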

Thank you so much! You solved my issue - I loaded the identity into the projection matrix before calling glOrtho as you suggested and it worked perfectly!

If you want to avoid having to “abuse” OpenGL’s matrix stack just to obtain a matrix for use in your shaders (especially when using GL >= 3.2 core profile, where the matrix stack is not available anymore), have a look at:

Thanks for the advice - I was planning to switch to using glm at some point but hadn’t got around to looking for a Java port.