OpenGL rendering problem

I’m currently learning OpenGL with the book “Learning Modern 3D Graphics Programming”. I’m using the Java binding for OpenGL, LWJGL, but I can’t get one thing to work.

What I’m trying to do: from a set of 3 vertices (each with 4 coordinates, XYZW) that a vertex shader passes through unmodified, and a fragment shader that outputs a constant white color (1.0, 1.0, 1.0, 1.0), render a white triangle. The vertices are stored in a vertex buffer and drawn to the screen with GL_TRIANGLES.

Here’s the code: http://pastebin.java-gaming.org/8a9c60f8390

What I actually see is a black window (as specified with glClearColor), but the triangle is absent. Any solutions/fixes?

Unless I’m missing something really obvious I can’t see why anything would be displayed - you never invoke the initializeProgram and displayTriangle methods! The display is black because IIRC that’s the default background colour.

Call the init method after you’ve created the Display, and call the render method in the update loop at the end of the constructor.

I also suggest you add some temporary ‘logging’ println messages and/or use your debugger to verify that the code is doing what you think it is. Also change the values in glClearColor to be sure that you’re not just seeing the default background.


Sorry, I had removed a part of the code; the updated version is (after the try/catch block in Main):


initializeProgram();
initializeBuffer();

while (!Display.isCloseRequested()) {
	displayTriangle();

	Display.update();
	Display.sync(TARGET_FPS);
}

But still nothing works. The shaders are compiled without errors.

The other thing that caught my eye was this:

for (int shader : shaders)
         GL20.glDetachShader(program, shader);

at the bottom of createShaderProgram, which looks to me to be undoing everything else in that method!?

May I see your vertex shader?

Edit: Sorry, didn’t realize it was stored as a string in the code.

I just can’t find the error here. If I remove the glDetachShader line the program behaves the same… and I think that once the program is linked, you’re allowed to detach and delete the shaders.

So basically your problem is that you are not setting up the matrices (model, view, projection) to be multiplied with the vertices, so the triangle never shows up.

So how do you think the code should look? Since I’m new to OpenGL, I don’t know how to do what you described above.

Matrices in OpenGL are a very important concept. They basically define the viewing space of the screen. There are 3 main types of matrices in OpenGL: model, view and projection. Each mesh has its own model matrix which defines how it is transformed into world space. The view matrix is independent of the model matrix. It basically defines where the camera is, and is applied to every model after the model matrix. The projection matrix is what you can think of as your eyes, or the lenses of a camera. It defines what you can actually see and is finally multiplied by the modelview matrix.

I just gave you a quick vague explanation, but if you want to learn more, I recommend you see this article: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-3-matrices/

Back in the days of legacy OpenGL, all of the matrix math was done for you. You could just make a few function calls to set up the projection, and easily render 3D scenes. Modern versions, on the other hand, expect you to do all of that matrix math yourself, and for a good reason: OpenGL is not a math library, it is purely a graphics library. Since you are using a modern GL shading version, you would need to set up the matrices yourself. The article I linked has some great explanations on how you would go about implementing them, too.
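To make “do the matrix math yourself” concrete, here is a minimal sketch (pure Java, no GL calls) of building the kind of perspective projection matrix that gluPerspective sets up for you. The class and method names are just illustrative, and the layout is a column-major float[16], which is what OpenGL expects by default.

```java
public class PerspectiveSketch {
    // Builds a perspective projection matrix equivalent to
    // gluPerspective(fovYDegrees, aspect, near, far).
    public static float[] perspective(float fovYDegrees, float aspect, float near, float far) {
        float f = (float) (1.0 / Math.tan(Math.toRadians(fovYDegrees) / 2.0));
        float[] m = new float[16]; // column-major; untouched entries stay 0
        m[0] = f / aspect;                        // x scale
        m[5] = f;                                 // y scale
        m[10] = (far + near) / (near - far);      // z remap into clip space
        m[11] = -1.0f;                            // copies -z into w for the perspective divide
        m[14] = (2.0f * far * near) / (near - far);
        return m;
    }

    public static void main(String[] args) {
        float[] m = perspective(70f, 800f / 600f, 0.1f, 1000f);
        System.out.println(m[0] + " " + m[5] + " " + m[11]);
    }
}
```

In a modern pipeline you would wrap this array in a FloatBuffer and upload it as a uniform instead of relying on the fixed-function matrix stack.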

This is just temporary code to show you how it’s done without getting into how to set up matrices. I recommend you completely move into the modern way of doing things, which you pretty much already are, but implement matrices and pass them in as uniform variables to the vertex shader.

A few other things. You weren’t flipping the buffer after filling it with data, which was a separate problem that would on its own have prevented it from working. I also moved the vertices back on the z axis a little so they wouldn’t be clipped by the near clipping plane defined in the gluPerspective call. One last thing I need to point out is that I used the lwjgl-util library for the gluPerspective method.
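A quick illustration of why flip() matters: put() advances the buffer’s position, and OpenGL reads from the current position up to the limit, so an unflipped buffer looks empty to glBufferData.

```java
import java.nio.FloatBuffer;

public class FlipSketch {
    public static void main(String[] args) {
        FloatBuffer buf = FloatBuffer.allocate(4);
        buf.put(new float[] { 0.75f, 0.75f, 0.0f, 1.0f });
        // After put(), position is at the end: nothing "remaining" to read.
        System.out.println("before flip: position=" + buf.position() + " remaining=" + buf.remaining());
        buf.flip(); // limit -> old position, position -> 0
        System.out.println("after flip:  position=" + buf.position() + " remaining=" + buf.remaining());
    }
}
```

This prints position=4, remaining=0 before the flip and position=0, remaining=4 after it, which is why the flipped buffer is the one you hand to glBufferData.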

Here is your problem fixed using a bit older version of GL to set up the matrices:


// whatever package
import static org.lwjgl.opengl.GL11.GL_MODELVIEW;
import static org.lwjgl.opengl.GL11.GL_PROJECTION;
import static org.lwjgl.opengl.GL11.glLoadIdentity;
import static org.lwjgl.opengl.GL11.glMatrixMode;
import static org.lwjgl.util.glu.GLU.gluPerspective;

import java.nio.FloatBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL15;
import org.lwjgl.opengl.GL20;
import org.lwjgl.opengl.GL30;

public class Test
{
	public static final int WIDTH = 800;
	public static final int HEIGHT = 600;

	public static final String TITLE = "Learning OpenGL";

	public static final boolean FULLSCREEN = false;
	public static final boolean RESIZABLE = false;
	public static final boolean VSYNC = false;

	public static final int TARGET_FPS = 60;

	private int program;

	private int vertexBuffer;
	private int vertexArray;

	private float[] vertexPositions = new float[] { 0.75f, 0.75f, -10f, 1.0f, 0.75f, -0.75f, -10f, 1.0f, -0.75f, -0.75f, -10f, 1.0f };

	// modern shaders
	// private String vertexShader = "#version 330\n" + "layout(location = 0) in vec4 position;\n" + "void main()\n" + "{\n" + "   gl_Position = position;\n" + "}\n";
	// private String fragmentShader = "#version 330\n" + "out vec4 outputColor;\n" + "void main()\n" + "{\n" + "   outputColor = vec4(1.0f, 1.0f, 1.0f, 1.0f);\n" + "}\n";

	/*
	 * older shaders (they still work, though)
	 * gl_Vertex is a built-in attribute holding the vertex position.
	 * gl_ModelViewProjectionMatrix is the built-in matrix uniform (the product of the modelview and projection matrices).
	 */
	private String vertexShader2 = "#version 110\n" + "void main()\n" + "{\n" + "gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n" + "}\n";
	private String fragShader2 = "#version 110\n" + "void main()\n" + "{\n" + "gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);\n" + "}\n";

	private int createShader(int type, String code)
	{
		int shader;

		shader = GL20.glCreateShader(type);

		GL20.glShaderSource(shader, code);
		GL20.glCompileShader(shader);

		if (GL20.glGetShaderi(shader, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE)
		{
			System.out.println("Failed to compile a shader.");
			System.out.println(GL20.glGetShaderInfoLog(shader, 5000));
			System.exit(1);
		}

		return shader;
	}

	private int createShaderProgram(int[] shaders)
	{

		int program;

		program = GL20.glCreateProgram();

		for (int shader : shaders)
			GL20.glAttachShader(program, shader);

		GL20.glLinkProgram(program);

		if (GL20.glGetProgrami(program, GL20.GL_LINK_STATUS) == GL11.GL_FALSE)
		{
			System.out.println("Failed to link a program.");
			System.out.println(GL20.glGetProgramInfoLog(program, 5000));
			System.exit(1);
		}

		for (int shader : shaders)
			GL20.glDetachShader(program, shader);

		return program;
	}

	/*
	 * Initialize the vertex buffer object that stores the vertex positions
	 */
	public void initializeBuffer()
	{
		vertexBuffer = GL15.glGenBuffers();

		FloatBuffer vertexPositionsBuffer = BufferUtils.createFloatBuffer(vertexPositions.length);
		vertexPositionsBuffer.put(vertexPositions);

		// NEVER forget to flip your buffers for OpenGL!
		vertexPositionsBuffer.flip();

		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertexBuffer);
		GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vertexPositionsBuffer, GL15.GL_STATIC_DRAW);
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
	}

	/*
	 * Initialize the program by compiling shaders and linking the shader program
	 */
	public void initializeProgram()
	{
		int[] shaderList = new int[] { createShader(GL20.GL_VERTEX_SHADER, vertexShader2), createShader(GL20.GL_FRAGMENT_SHADER, fragShader2) };

		program = createShaderProgram(shaderList);

		vertexArray = GL30.glGenVertexArrays();
		GL30.glBindVertexArray(vertexArray);
	}

	/*
	 * Display the triangle in the window
	 */
	public void displayTriangle()
	{
		GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
		GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);

		GL20.glUseProgram(program);

		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertexBuffer);
		GL20.glEnableVertexAttribArray(0);
		GL20.glVertexAttribPointer(0, 4, GL11.GL_FLOAT, false, 0, 0);

		GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, 3);

		GL20.glDisableVertexAttribArray(0);
		GL20.glUseProgram(0);
	}

	public Test()
	{

		try
		{
			Display.setDisplayMode(new DisplayMode(WIDTH, HEIGHT));
			Display.setTitle(TITLE);
			Display.setFullscreen(FULLSCREEN);
			Display.setResizable(RESIZABLE);
			Display.setVSyncEnabled(VSYNC);

			Display.create();
		} catch (LWJGLException e)
		{
			System.out.println("Display initialization error.");
			System.exit(0);
		}

		/*
		 * This is the main change. Define the projection matrix.
		 */
		glMatrixMode(GL_PROJECTION);
		glLoadIdentity();
		// cast to float: otherwise WIDTH / HEIGHT is integer division (800 / 600 == 1)
		gluPerspective(70, (float) WIDTH / HEIGHT, 0.1f, 1000f);
		glMatrixMode(GL_MODELVIEW);

		initializeProgram();
		initializeBuffer();

		while (!Display.isCloseRequested())
		{
			displayTriangle();
			Display.update();
			Display.sync(TARGET_FPS);
		}
	}

	public static void main(String[] args)
	{
		new Test();
	}
}

Thanks a lot, ForeseenParadox. Now it’s all much clearer. Thanks for the support.

With your code, matrices shouldn’t be a problem. Don’t use legacy things like GL_MODELVIEW and GL_PROJECTION. In modern OpenGL, you create the matrices yourself and then pass them in as uniforms. But, like I said, your code will work without using any matrices.
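For reference, a minimal sketch of what that looks like: the matrix becomes an explicit uniform in a modern vertex shader (the name `mvp` is just illustrative), written as an embedded Java string in the same style as the code above.

```java
public class ModernShaderSketch {
    // Modern-style GLSL 330 vertex shader: the matrix is an explicit uniform
    // instead of the removed gl_ModelViewProjectionMatrix built-in.
    public static final String VERTEX_SHADER =
            "#version 330\n" +
            "layout(location = 0) in vec4 position;\n" +
            "uniform mat4 mvp;\n" +                     // illustrative name, set from the Java side
            "void main()\n" +
            "{\n" +
            "    gl_Position = mvp * position;\n" +
            "}\n";

    public static void main(String[] args) {
        System.out.println(VERTEX_SHADER);
    }
}
```

On the Java side you would look the uniform up once with GL20.glGetUniformLocation(program, "mvp") and, while the program is bound, upload a column-major FloatBuffer with GL20.glUniformMatrix4(location, false, matrixBuffer).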

Have you tried printing glGetError() in your render method?

In fact, I got it to work by following the Java port of my book’s code (which is in C++); here you can see my classes:

Window.java: http://pastebin.java-gaming.org/a9c6f13809c
Triangle.java: http://pastebin.java-gaming.org/9c6f3208c93

Anyway, thanks to everyone for your support. OpenGL is a difficult field for newbies. Especially Italians! :smiley:

Also, I think I was missing glViewport(0, 0, WIDTH, HEIGHT); at startup.