Do I have to use VAOs?

I just updated from my shitty ASUS drivers to actually working, non-crashing, non-flickering video drivers from Nvidia. However, this broke about half of my games and tests. The problem only occurs in OpenGL 3 games that use buffers but do not use Vertex Array Objects for the shader attribute mapping (glVertexAttribPointer and glEnableVertexAttribArray). I tried to reproduce the problem in a small test program, but at first that test worked both with and without VAOs, which only confused me more. I have since managed to reproduce it; see the bottom of this post.
The fact remains that my main project is affected by this (in an “it worked before, but not anymore” way). Nvidia is known for bending the rules in GLSL with automatic casting etc., so could this be something similar? Do I HAVE to use VAOs? Is it illegal to call the VertexAttrib functions without a VAO bound? It’s the only thing I can think of that could have been “fixed” between the two drivers.

I’ve created a small program to show what happens. (It didn’t reproduce the problem at first because I was using a backwards-compatible context.) Just copy this into a class, add the LWJGL library, set up the LWJGL natives and run it. It should open a window and render a colored triangle in the top-right quadrant of the window.
REQUIRES: an OpenGL 3.2 compatible (= DirectX 10 class) video card. If you have an OpenGL 3.2 compatible card and it still doesn’t work, update your drivers.
To switch VAOs on and off, change the USE_VAO constant.

Here is the code:

import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.ContextAttribs;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL15;
import org.lwjgl.opengl.GL20;
import org.lwjgl.opengl.GL30;
import org.lwjgl.opengl.PixelFormat;

public class VAOTest {

    private static final boolean USE_VAO = false;

    private void run() throws Exception {

        //Display setup
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create(new PixelFormat(), new ContextAttribs(3, 2).withProfileCompatibility(false));


        //Vertex shader
        int vertexShader = GL20.glCreateShader(GL20.GL_VERTEX_SHADER);
        GL20.glShaderSource(vertexShader, getVertexShaderSource());
        GL20.glCompileShader(vertexShader);

        String vertexShaderErrorLog = GL20.glGetShaderInfoLog(vertexShader, 65536);
        if (vertexShaderErrorLog.length() != 0) {
            System.err.println("Vertex shader compile log: \n" + vertexShaderErrorLog);
        }


        //Fragment shader
        int fragmentShader = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
        GL20.glShaderSource(fragmentShader, getFragmentShaderSource());
        GL20.glCompileShader(fragmentShader);

        String fragmentShaderErrorLog = GL20.glGetShaderInfoLog(fragmentShader, 65536);
        if (fragmentShaderErrorLog.length() != 0) {
            System.err.println("Fragment shader compile log: \n" + fragmentShaderErrorLog);
        }


        //Shader program
        int program = GL20.glCreateProgram();
        GL20.glAttachShader(program, vertexShader);
        GL20.glAttachShader(program, fragmentShader);

        GL20.glLinkProgram(program);
        String log = GL20.glGetProgramInfoLog(program, 65536);
        if (log.length() != 0) {
            System.err.println("Program link log:\n" + log);
        }


        //Get vertex location
        int vertexLocation = GL20.glGetAttribLocation(program, "inVertex");
        if (vertexLocation == -1) {
            System.err.println("SHADER ERROR: Attribute 'inVertex' does not exist!");
        }
        int colorLocation = GL20.glGetAttribLocation(program, "inColor");
        if (colorLocation == -1) {
            System.err.println("SHADER ERROR: Attribute 'inColor' does not exist!");
        }


        //Create vertex data
        FloatBuffer vertexData = BufferUtils.createFloatBuffer(5 * 3);
        vertexData.put(new float[]{
                    //x, y, r, g, b
                    0, 0, 1, 0, 0, //1st vertex
                    1, 0, 0, 1, 0, //2nd
                    0, 1, 0, 0, 1 //3rd
                });
        vertexData.flip();


        //Create index data
        ShortBuffer indexData = BufferUtils.createShortBuffer(3);
        indexData.put(new short[]{0, 1, 2});
        indexData.flip();


        //Buffer creation and loading
        int vertexBuffer = GL15.glGenBuffers();
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertexBuffer);
        GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vertexData, GL15.GL_STATIC_DRAW);

        int indexBuffer = GL15.glGenBuffers();
        GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
        GL15.glBufferData(GL15.GL_ELEMENT_ARRAY_BUFFER, indexData, GL15.GL_STATIC_DRAW);


        //Setup VAO if needed
        int vao = -1;
        if (USE_VAO) {
            vao = GL30.glGenVertexArrays();
            GL30.glBindVertexArray(vao);

            GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertexBuffer); //Already bound, but rebind for clarity
            GL20.glVertexAttribPointer(vertexLocation, 2, GL11.GL_FLOAT, false, 5 * 4, 0);
            GL20.glEnableVertexAttribArray(vertexLocation);
            GL20.glVertexAttribPointer(colorLocation, 3, GL11.GL_FLOAT, false, 5 * 4, 2 * 4);
            GL20.glEnableVertexAttribArray(colorLocation);

            GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);

            GL30.glBindVertexArray(0); //Unbind
        }


        //Clear color
        GL11.glClearColor(0, 0, 0, 0);


        while (!Display.isCloseRequested()) {
            //Clear screen
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);

            //Enable shader
            GL20.glUseProgram(program);

            //Either use VAO or enable vertex attributes manually
            if (USE_VAO) {
                GL30.glBindVertexArray(vao);
            } else {
                GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertexBuffer);
                GL20.glVertexAttribPointer(vertexLocation, 2, GL11.GL_FLOAT, false, 5 * 4, 0);
                GL20.glEnableVertexAttribArray(vertexLocation);
                GL20.glVertexAttribPointer(colorLocation, 3, GL11.GL_FLOAT, false, 5 * 4, 2 * 4);
                GL20.glEnableVertexAttribArray(colorLocation);

                GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
            }

            int error = GL11.glGetError();
            if (error != 0) {
                System.out.println("OpenGL error '" + error + "' occured!");
            }

            //Draw triangle
            GL11.glDrawElements(GL11.GL_TRIANGLES, 3, GL11.GL_UNSIGNED_SHORT, 0);

            Display.update();
        }

        //Clean up the display and GL context before exiting
        Display.destroy();
    }

    public static void main(String[] args) throws Exception {
        new VAOTest().run();
    }

    public String getVertexShaderSource() {
        return "#version 330\n"
                + "in vec2 inVertex;"
                + "in vec3 inColor;"
                + "out vec3 fColor;"
                + "void main(){"
                + "gl_Position = vec4(inVertex, 0, 1);"
                + "fColor = inColor;"
                + "}";
    }

    public String getFragmentShaderSource() {
        return "#version 330\n"
                + "in vec3 fColor;"
                + "layout(location = 0) out vec4 fragColor;"
                + "void main(){"
                + "fragColor = vec4(fColor, 1);"
                + "}";
    }
}

Tested with an Nvidia GTX 460M. It only works with USE_VAO set to true for me. It would be awesome if somebody with a Radeon card could try this too.

Windows 7 Ultimate 64-bit
DirectX 11
Nvidia GeForce GTS 250
latest drivers

OpenGL should be 3.3.0

Best case scenario, with USE_VAO = false I still get this:

[quote]OpenGL error ‘1282’ occurred!
[/quote]
With USE_VAO = true it works fine.
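
(Side note: to confirm which GL and GLSL versions the driver actually gives you, assuming the same LWJGL setup as the test above, the version strings can be printed right after Display.create(). A tiny sketch:)

//Print the version strings the driver reports for the created context
System.out.println("GL_VERSION: " + GL11.glGetString(GL11.GL_VERSION));
System.out.println("GLSL:       " + GL11.glGetString(GL20.GL_SHADING_LANGUAGE_VERSION));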

Okay, I guess you have to use VAOs in a core (non-compatibility) OpenGL 3+ context… Good to know. It must have been a bug (or leniency) in the old drivers that allowed you to skip them… Great, now I’ve got a whole lot of shit to fix. -.-
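
For the record, error 1282 is GL_INVALID_OPERATION, which matches the core-profile rule that vertex attribute setup and draw calls need a VAO bound. If you just want existing glVertexAttribPointer-style code to keep working, a minimal workaround (a sketch, assuming an LWJGL setup like the test above) is to generate one VAO right after context creation and leave it bound for the whole program:

//Minimal sketch: one "global" VAO bound for the program's lifetime, so the old
//glVertexAttribPointer/glEnableVertexAttribArray code keeps working in a core context.
Display.create(new PixelFormat(), new ContextAttribs(3, 2).withProfileCompatibility(false));

int globalVao = GL30.glGenVertexArrays(); //requires an OpenGL 3.0+ context
GL30.glBindVertexArray(globalVao);        //stays bound; all attribute state is stored in it

//...the rest of the setup and render code stays as it was...

//On shutdown:
GL30.glDeleteVertexArrays(globalVao);
Display.destroy();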

Well, don’t write code that only works if OpenGL 3.0+ is available.
Even I only have 3.3.

I never use anything above 1.1

Or at least code it so that if 3+ isn’t available it still works with lower GL versions and just doesn’t look as good.
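
(If you do want that kind of fallback, LWJGL can tell you at runtime what the created context supports. A rough sketch, with the branch contents left as placeholders:)

//Sketch: choose a render path based on what the context actually supports
org.lwjgl.opengl.ContextCapabilities caps = org.lwjgl.opengl.GLContext.getCapabilities();
if (caps.OpenGL30) {
    //modern path: VAOs, core FBOs, GLSL 1.30+
} else if (caps.GL_EXT_framebuffer_object) {
    //older path: extension FBOs, GLSL 1.20 or fixed function
} else {
    //minimal path without FBOs
}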

I’ve started to only use OpenGL 3+. I see no reason to dwell in the past. Heck, DX10 cards are like 5 years old now, aren’t they? Limiting the game to them only excludes graphics cards that couldn’t have run it decently anyway, either for performance reasons or because they just sucked (*hint* Intel cards *hint*). There is also a big difference between OGL 3 and older versions, as the fixed-function pipeline is gone entirely, which makes supporting both OGL 3 and older versions a lot of work.
In practice I don’t see a problem with this. The game I’m making will use FBOs, which were only promoted to core in 3.0. The only cards that haven’t supported OGL 3.0 so far are Intel cards, and even though they could do DX9 (which implies FBO-level hardware) they didn’t even support the GL FBO extension. They won’t be able to run the game anyway, so I see no reason not to go OGL 3+ only from now on.
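
For reference, a core-style FBO setup with the same GL30 bindings looks roughly like this (a sketch with a single RGBA8 color attachment; fboWidth/fboHeight are placeholder values and error handling is minimal):

//Sketch: core (GL30) framebuffer object with one color texture attachment
int fboWidth = 800, fboHeight = 600;

int colorTexture = GL11.glGenTextures();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, colorTexture);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, fboWidth, fboHeight, 0,
        GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (java.nio.ByteBuffer) null);

int fbo = GL30.glGenFramebuffers();
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
GL30.glFramebufferTexture2D(GL30.GL_FRAMEBUFFER, GL30.GL_COLOR_ATTACHMENT0,
        GL11.GL_TEXTURE_2D, colorTexture, 0);

if (GL30.glCheckFramebufferStatus(GL30.GL_FRAMEBUFFER) != GL30.GL_FRAMEBUFFER_COMPLETE) {
    System.err.println("FBO is not complete!");
}
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0); //back to the default framebuffer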

Oh, I’m so rude… Cero! Thanks a lot for testing!!!

FYI, Mac drivers have not been updated to support OpenGL 3.x even though all current Macs ship with cards that support it. This will HOPEFULLY (glares at Apple) be fixed soon, but if you have any desire to test or run on Macs, you won’t be able to use OpenGL 3.x until some time in the future.

Bwahahaha, soon you’re gonna say that they don’t have DirectX 10 support either!

Wait a minute…

So Apple are being stupid again? Well, it’s great that the biggest electronics company on earth hasn’t implemented a FUCKING 3 YEAR OLD GRAPHICS API YET. Ridiculous. Is there even any way at all to access the DirectX 10 level features of your graphics card on a Mac? I’m so gonna laugh my ass off if there isn’t.

I ran Windows XP until recently, and I still do (dual-booting Linux) on all my other machines and my laptop.

They have many of the extensions for 3.0, like FBOs and I think VAOs, but they don’t have the newer GLSL language support, which is a major downer. It was supposed to be released with Mac OS X 10.6.3 (we’re on 10.6.8 and 10.7 now, BTW).

But OpenGL 3+ works on Windows XP, right? Isn’t that one of the selling points of OpenGL? If so, then I think I can live with Macs being late to my party. :-\

I don’t have a Mac, but I’m pretty sure they have supported GLSL for a long time…

I think he meant the newer versions of GLSL.

Macs have GLSL versions up to 1.20. The reason XP has support and Mac doesn’t is that XP lets the card manufacturers implement and ship their drivers whenever they have updates, while on Mac everything has to go through Apple, and for some reason it has not been a high priority for them.
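
To make that concrete: with GLSL capped at 1.20, the shaders from the test above would have to be rewritten in the old attribute/varying style. A sketch, keeping the same inVertex/inColor names:

public String getVertexShaderSource120() {
    //GLSL 1.20 version of the same shader: 'attribute'/'varying' instead of 'in'/'out',
    //and no layout qualifiers; attribute locations are queried or bound from the Java side.
    return "#version 120\n"
            + "attribute vec2 inVertex;"
            + "attribute vec3 inColor;"
            + "varying vec3 fColor;"
            + "void main(){"
            + "gl_Position = vec4(inVertex, 0, 1);"
            + "fColor = inColor;"
            + "}";
}

public String getFragmentShaderSource120() {
    //GLSL 1.20 fragment shader writes to the built-in gl_FragColor instead of a declared output
    return "#version 120\n"
            + "varying vec3 fColor;"
            + "void main(){"
            + "gl_FragColor = vec4(fColor, 1);"
            + "}";
}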

Probably because they’re too busy selling iPads to your 5 year old niece so they can play Angry Birds and be content.

+1
Because a Box2D-based destroy-the-castle game is such a unique and innovative idea that has never been done thousands of times before on ArmorGames. :cranky:

YES. I really really hate Angry Birds.