Rendering textures via VBO

I forgot about that post, but everything is working except that I don’t see the image/texture drawn.
My rendering code:

glClear(GL_COLOR_BUFFER_BIT);

vertexData.clear();

glBindBuffer(GL_ARRAY_BUFFER, vboVertexHandle);
glVertexPointer(vertexSize, GL_FLOAT, 0, 0L);

glBindBuffer(GL_ARRAY_BUFFER, vboColorHandle);
glColorPointer(colorSize, GL_FLOAT, 0, 0L);

glBindBuffer(GL_ARRAY_BUFFER, vboTextureUVHandle);
// 2 because we have 2 texture coordinates per-vertex
glTexCoordPointer(2, GL_FLOAT, 0, 0L);

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glDrawArrays(GL_TRIANGLES, 0, amountOfVertices);
glBindTexture(GL_TEXTURE_2D, myTextureID);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
Change

glDrawArrays(GL_TRIANGLES, 0, amountOfVertices);
glBindTexture(GL_TEXTURE_2D, myTextureID);

to

glBindTexture(GL_TEXTURE_2D, myTextureID);
glDrawArrays(GL_TRIANGLES, 0, amountOfVertices);

Also, make sure texturing is enabled and the texture is uploaded beforehand:

GL11.glEnable(GL11.GL_TEXTURE_2D);

glBindTexture(GL_TEXTURE_2D, myTextureID);
glTexImage2D(...);

http://pastebin.java-gaming.org/6e10a2291051d
Still not rendered. Am I supposed to repeat some of those setup calls inside the render method? For example:

glBindTexture();

Do I call it both before and inside the render loop?

Uncomment the drawArrays call :cranky:

Also, I forgot this:

int myTextureID = glGenTextures();
glBindTexture(GL_TEXTURE_2D, myTextureID);

// Specifies how the texture will be handled when out-of-range
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL12.GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL12.GL_CLAMP_TO_EDGE);

// Specifies how the texture should interpolate when scaled (GL_NEAREST for nearest-neighbour, GL11.GL_LINEAR for blurry stuffs)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);

GL11.glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, image.getWidth(), image.getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, dataFromImage(image));

(Note the order: generate and bind the texture first, since glTexParameteri applies to whichever texture is currently bound.)

It works! Now, how do I make this triangle a quad? I tried changing the vertex data:

-0.5f, -0.5f, 0,
 0.5f, -0.5f, 0,
 0.5f,  0.5f, 0,

-0.5f, -0.5f, 0,
 0.5f,  0.5f, 0,
 0.5f, -0.5f, 0,

Nice! Now the texture coordinates should be proportional to the vertex locations. Like this:

0, 0,
1, 0,
1, 1,

0, 0,
1, 1,
1, 0

It didn’t work right. I’ll just figure out how to arrange the data to make a quad, but can I get a brief lecture on how the tex coords work?

What do these stand for?

0, 0,
1, 0,
1, 1,

0, 0,
1, 1,
1, 0

And is there a way I can change the ortho so I don’t have to use floats for drawing shapes and such?

Use the 0s and 1s I gave you in your tex-coord buffer.

Texture coordinates basically state which part of the texture corresponds to which part of the geometry (Your vertices). The UV system is just a way of representing that relationship.

As for the ‘ortho’: get a book about OpenGL (the OpenGL SuperBible, 6th edition, is good). Also learn vector/matrix math — SUPER useful in graphics programming and game development.

//The matrix used to multiply vertex positions in a projection sense
glMatrixMode(GL_PROJECTION);
//Loads the identity matrix into projection (google "identity matrix")
glLoadIdentity();
//Set the current matrix (projection) to an orthographic projection
glOrtho(0, Display.getWidth(), Display.getHeight(), 0, -1, 1);
//Model-view: the model and view matrix together
glMatrixMode(GL_MODELVIEW);

You shouldn’t be doing this every frame. Do it once when you load the texture. I noticed that it was in your loop, JayManHall.

These are the UV coordinates of the texture that you are using. Each coordinate (each line for U, V) says what part of the texture that vertex should represent. (0, 0) starts in the top-left, and (1, 1) is in the bottom right. They’re basically like Cartesian coordinates for the texture.

So, this code is saying that the first vertex should take the top-left corner of the texture, the 2nd vertex should be the top-right corner of the texture, the 3rd should be the bottom-right, and so on.
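To make that corner-to-vertex mapping concrete, here’s a minimal sketch in plain Java (no OpenGL involved; the class and method names, and the 64x64 texture size, are made up for illustration). It shows that a UV coordinate is just a fraction of the texture’s size:

```java
// Sketch: a UV coordinate is a fraction of the texture's dimensions.
// Multiplying (u, v) by the texture size gives the texel it refers to.
// UvDemo, toTexelX/toTexelY and the 64x64 size are assumptions, not
// anything from LWJGL; (0, 0) is taken as the top-left as described above.
public class UvDemo {
    static final int TEX_W = 64, TEX_H = 64; // assumed texture size

    static int toTexelX(float u) { return Math.round(u * (TEX_W - 1)); }
    static int toTexelY(float v) { return Math.round(v * (TEX_H - 1)); }

    public static void main(String[] args) {
        // UV (0, 0) -> texel (0, 0): the top-left corner of the texture
        System.out.println(toTexelX(0f) + ", " + toTexelY(0f));
        // UV (1, 1) -> texel (63, 63): the bottom-right corner
        System.out.println(toTexelX(1f) + ", " + toTexelY(1f));
    }
}
```

So a vertex given UV (0, 0) samples the texture’s top-left corner, and one given (1, 1) samples the bottom-right, regardless of where the vertex itself sits on screen.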

You can multiply your vertex positions by an orthographic projection matrix, which will let you use units like pixels, and will normalize those coordinates for OpenGL to use.
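As a rough sketch of what that normalization does (plain Java, no LWJGL; the class name, method names, and the 800x600 display size are my own assumptions), an ortho projection set up like glOrtho(0, width, height, 0, -1, 1) maps pixel coordinates into OpenGL’s -1..1 range:

```java
// Sketch: what an ortho projection with left=0, right=WIDTH, bottom=HEIGHT,
// top=0 does to a pixel position. Pixels in, normalized device coordinates
// (-1..1) out. Ortho2D/toNdcX/toNdcY and the 800x600 size are illustrative.
public class Ortho2D {
    static final float WIDTH = 800f, HEIGHT = 600f; // assumed display size

    static float toNdcX(float px) { return 2f * px / WIDTH - 1f; }
    // top=0, bottom=HEIGHT flips the y axis so it points down, like pixels
    static float toNdcY(float py) { return 1f - 2f * py / HEIGHT; }

    public static void main(String[] args) {
        // Top-left pixel (0, 0) maps to NDC (-1, 1)
        System.out.println(toNdcX(0f) + ", " + toNdcY(0f));
        // Bottom-right pixel (800, 600) maps to NDC (1, -1)
        System.out.println(toNdcX(800f) + ", " + toNdcY(600f));
    }
}
```

That is why, with an ortho set up in pixels, you can position vertices with whole pixel counts instead of hand-picked floats between -1 and 1.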

Don’t use this. Instead, jump straight into modern OpenGL. Don’t try to tie the deprecated and programmable pipelines together.

Thanks! Is this vertex data correct for making a square? It throws an exception pointing to the data I put in.

-0.5f, -0.5f, 0,
 0.5f, -0.5f, 0,
 0.5f,  0.5f, 0,

 0.5f,  0.5f, 0,
-0.5f, -0.5f, 0,
 0.5f, -0.5f, 0

Stacktrace please?

Exception in thread "main" java.nio.BufferOverflowException
	at java.nio.DirectFloatBufferU.put(Unknown Source)
	at java.nio.FloatBuffer.put(Unknown Source)
	at RenderingModes.VertexBufferObjectDemo.main(VertexBufferObjectDemo.java:82)

VBOs aren’t programmable pipeline… And using the built-in matrices doesn’t make it fixed (I think). You can use the built-in matrices with shaders. [icode]gl_Vertex … gl_ProjectionMatrix … gl_ModelViewMatrix[/icode] are keywords in GLSL.

Maybe change the ‘amount of vertices’ param to 6, seeing as you have 6 vertices and 3 components (X, Y, Z) per vertex.

What is, then?

The built in matrices are a part of the fixed pipeline. Those keywords are a kind of bridge between the old matrices and the programmable shaders. If you want to be fully programmable, then you compute and upload your own matrices as mat4’s.
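For what “upload your own mat4” amounts to on the Java side, here’s a minimal sketch (plain java.nio, no LWJGL or GL calls; Mat4Sketch and identityColumnMajor are made-up names). A mat4 is just 16 floats packed in column-major order, which is the default layout glUniformMatrix4fv expects:

```java
import java.nio.FloatBuffer;

// Sketch: packing a 4x4 matrix (here the identity) into a FloatBuffer in
// column-major order, ready to hand to something like glUniformMatrix4fv.
// Names are illustrative, not part of any library.
public class Mat4Sketch {
    static FloatBuffer identityColumnMajor() {
        FloatBuffer buf = FloatBuffer.allocate(16);
        for (int col = 0; col < 4; col++)
            for (int row = 0; row < 4; row++)
                buf.put(row == col ? 1f : 0f); // 1s down the diagonal
        buf.flip(); // rewind so the consumer reads from index 0
        return buf;
    }

    public static void main(String[] args) {
        FloatBuffer m = identityColumnMajor();
        System.out.println(m.remaining()); // 16 floats ready to upload
    }
}
```

Swap the identity for an ortho or model-view matrix you computed yourself and you’ve replaced the old matrix stack entirely.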

Agreed.

Had no effect

EDIT:

final int amountOfVertices = 6;
final int vertexSize = 3;
final int colorSize = 3;

FloatBuffer vertexData = BufferUtils.createFloatBuffer(amountOfVertices * vertexSize);
vertexData.put(new float[]{
        -0.5f,  0.5f, 0f,
        -0.5f, -0.5f, 0f,
         0.5f, -0.5f, 0f,
        // Right top triangle
         0.5f, -0.5f, 0f,
         0.5f,  0.5f, 0f,
        -0.5f,  0.5f, 0f
});
vertexData.flip();

FloatBuffer colorData = BufferUtils.createFloatBuffer(amountOfVertices * colorSize);
colorData.put(new float[]{1, 1, 1, 1, 1, 1, 1, 1, 1});
colorData.flip();

FloatBuffer textureUVData = BufferUtils.createFloatBuffer(amountOfVertices * 2); // amount of vertices * two texture coordinates per-vertex
textureUVData.put(new float[]{
        0, 0,
        1, 0,
        1, 1,

        0, 0,
        1, 1,
        1, 0
});
textureUVData.flip();

You need to add more colors. You only have enough color data for 3 vertices.
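The arithmetic behind that can be sketched with plain java.nio (no LWJGL’s BufferUtils; the class name here is made up). A buffer sized for 6 vertices at 3 color components each holds 18 floats, so putting only 9 leaves half of it unfilled:

```java
import java.nio.FloatBuffer;

// Sketch: sizing the color buffer. 6 vertices * 3 components (R, G, B)
// means 18 floats are required, not the 9 supplied above.
// ColorBufferSizing is an illustrative name, not from any library.
public class ColorBufferSizing {
    public static void main(String[] args) {
        final int amountOfVertices = 6;
        final int colorSize = 3; // R, G, B per vertex

        FloatBuffer colorData = FloatBuffer.allocate(amountOfVertices * colorSize);
        System.out.println(colorData.capacity()); // 18

        // Fill every slot: one white (1, 1, 1) triple per vertex
        for (int i = 0; i < amountOfVertices * colorSize; i++) {
            colorData.put(1f);
        }
        colorData.flip();
        System.out.println(colorData.remaining()); // 18 floats of color data
    }
}
```

The same per-vertex arithmetic explains the earlier BufferOverflowException on the vertex buffer: a buffer created for 3 vertices can’t take 6 vertices’ worth of floats.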

Programming the pipeline? Like shaders? How you give vertices to the beginning fetch is up to you. What makes a pipeline programmable is… Programming the pipeline, not the data you send down it.

I guess this is true, but it does the job, and you don’t need to hard-code projection matrices… I guess they’re kinda fixed-function.

Actually, they wouldn’t need a ‘bridge’, because you’re just running a program that’s invoked on every vertex/pixel, so it would be primarily the same data that was sent down before. Just used in a different context.

I personally like the old [icode]gl_Vertex[/icode] method, it supports immediate mode, and the good old VBOs. Although I do use my own uniform matrices with it, :wink:

I added the exact same color data and um…

http://i.minus.com/iRG1esBDF9OdU.gif

Shaders aren’t the only part of the programmable pipeline. They are really the main part, but VBOs are really the only way to upload the data.

The shaders were introduced with OpenGL 2. This was before everything was deprecated in 3+, so at the time it was the best way to handle your matrices. Now that they’re deprecated, there are other (better) ways to handle and send data to the shaders.

Why would you ever want to use immediate mode? Whenever I see a statement like this, I cringe. You don’t ever need to worry about whether something is compatible with immediate mode or not, because you should just never use it. This is another reason to stop using gl_Vertex…