Vertex Arrays and interleaved ByteBuffers

Hello guys :slight_smile:

I’ve got a problem with my current VertexArray implementation. Unlike most implementations in Java, it uses a ByteBuffer instead of a FloatBuffer. Apparently this does not work, but it should, shouldn’t it?

The code for the VertexArray class is the following:


package org.matheusdev.arcengine.ogl;

import java.nio.ByteBuffer;
import java.util.List;

import javax.media.opengl.GL2;

import org.matheusdev.arcengine.utils.BufferUtils;

/**
 * @author matheusdev
 *
 */
public class VertexArray {

	protected final List<ShaderAttrib> attributes;
	protected final int totalComponents;
	protected final int totalByteLength;
	protected final ByteBuffer buffer;

	/**
	 * Alias for <code>VertexArray(glslShader.getAttributes(), vertices)</code>.
	 */
	public VertexArray(ShaderProg glslShader, int vertices) {
		this(glslShader.getAttributes(), vertices);
	}

	public VertexArray(List<ShaderAttrib> attributes, int vertices) {
		this.attributes = attributes;

		int components = 0;
		int stride = 0;
		for (ShaderAttrib attribute : attributes) {
			components += attribute.components;
			stride += attribute.getTypeByteSize();
		}
		this.totalComponents = components;
		this.totalByteLength = stride;
		this.buffer = BufferUtils.newByteBuffer(totalByteLength * vertices);
	}

	public void begin(GL2 gl) {
		gl.glEnableClientState(GL2.GL_ARRAY_BUFFER);

		int offset = 0;
		for (ShaderAttrib attribute : attributes) {
			buffer.position(offset);

			gl.glEnableVertexAttribArray(attribute.location);
			gl.glVertexAttribPointer(
					attribute.location,
					attribute.components,
					attribute.getDataType(),
					false,
					totalByteLength,
					buffer);

			System.out.printf("glEnableVertexAttribArray(%d)%n", attribute.location);
			System.out.printf("glVertexAttribPointer(%n\tindx: %d,%n\tsize: %d,%n\ttype: %d,%n\tnormalized: false,%n\tstride: %d,%n\tbuffer: %s(offset = %d));%n",
					attribute.location, attribute.components, attribute.getDataType(), totalByteLength, buffer, offset);

			offset += attribute.byteLength;
		}
	}

	public void render(GL2 gl, int mode, int offset, int count) {
		gl.glDrawArrays(mode, offset, count);
	}

	public void end(GL2 gl) {
		for (ShaderAttrib attribute : attributes) {
			gl.glDisableVertexAttribArray(attribute.location);
		}
		gl.glDisableClientState(GL2.GL_ARRAY_BUFFER);
	}

	@Override
	public String toString() {
		StringBuilder str = new StringBuilder();
		str.append(String.format("VertexArray[components: %d, stride: %d, attributes:\n", totalComponents, totalByteLength));
		for (ShaderAttrib attribute : attributes) {
			str.append("\t").append(attribute).append("\n");
		}
		str.append("];");
		return str.toString();
	}

	public ByteBuffer getBuffer() {
		return buffer;
	}

	public int getComponentSum() {
		return totalComponents;
	}

	public int getComponentByteSum() {
		return totalByteLength;
	}

}

I’ve tested the [icode]ShaderProg[/icode] class, it works perfectly with [icode]glBegin(); / glEnd();[/icode]. The array’s [icode].toString();[/icode] prints the following:

So the attributes of the shader are loaded properly. The stride is in bytes and seems to be okay (components = 2, i.e. a vec2 of floats, so stride = 2 components * 4 bytes = 8 bytes).
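As a sanity check, the stride/offset bookkeeping for an interleaved layout can be sketched in plain Java. The layout below is hypothetical (the shader in this thread only has aVertex; a second vec2 attribute is added here just to make the running offset visible):

```java
public class InterleaveLayout {
    public static void main(String[] args) {
        // Hypothetical interleaved layout: aVertex (vec2) followed by
        // aTexCoords (vec2), all GL_FLOAT components of 4 bytes each.
        String[] names = { "aVertex", "aTexCoords" };
        int[] components = { 2, 2 };

        // Stride = total byte size of one whole vertex.
        int stride = 0;
        for (int c : components) {
            stride += c * Float.BYTES;
        }

        // Each attribute starts where the previous one ended.
        int offset = 0;
        for (int i = 0; i < names.length; i++) {
            System.out.printf("%s: offset=%d, stride=%d%n", names[i], offset, stride);
            offset += components[i] * Float.BYTES;
        }
    }
}
```

With only aVertex in the layout, this collapses to offset = 0 and stride = 8, matching the 8-byte stride computed above.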

I fill the ByteBuffer myself, with this code:


		onlyVerts = new VertexArray(Arrays.asList(
				onlyVertShader.getAttribute("aVertex")), 4);
		ByteBuffer buf0 = onlyVerts.getBuffer();
		buf0
			.putFloat(256).putFloat(0)
			.putFloat(384).putFloat(0)
			.putFloat(384).putFloat(128)
			.putFloat(256).putFloat(128)
			.flip();
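One classic pitfall with hand-allocated ByteBuffers and OpenGL, worth ruling out here, is byte order: a fresh ByteBuffer defaults to big-endian, while the driver reads vertex data in native order. Presumably BufferUtils.newByteBuffer already calls order(ByteOrder.nativeOrder()); a minimal plain-JDK sketch of the fill-and-read-back round trip:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BufferRoundTrip {
    public static void main(String[] args) {
        // Direct buffers default to BIG_ENDIAN; GL expects native order,
        // so forgetting order(...) yields byte-swapped (garbage) floats.
        ByteBuffer buf = ByteBuffer.allocateDirect(4 * 2 * Float.BYTES)
                                   .order(ByteOrder.nativeOrder());
        buf.putFloat(256).putFloat(0)
           .putFloat(384).putFloat(0)
           .putFloat(384).putFloat(128)
           .putFloat(256).putFloat(128)
           .flip();
        while (buf.hasRemaining()) {
            System.out.printf("Vertex: aVertex(%G, %G)%n",
                    buf.getFloat(), buf.getFloat());
        }
        buf.position(0); // rewind so GL can still read from position 0
    }
}
```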

Its content is then checked with this code, and the position reset:


while (buf0.hasRemaining()) {
	System.out.printf("Vertex: aVertex(%G, %G)%n",
			buf0.getFloat(), buf0.getFloat());
}
buf0.position(0);

This is getting printed:

Finally, this is what the VertexArray#begin(gl) method prints:

The array is rendered like this:


onlyVertShader.use(gl);
onlyVerts.begin(gl);
onlyVerts.render(gl, GL2.GL_QUADS, 0, 1);
onlyVerts.end(gl);

The vertex shader is the following:


uniform mat4 uViewProjMatrix;

attribute vec2 aVertex;

void main(void) {
	gl_Position = uViewProjMatrix * vec4(aVertex, 0.0, 1.0);
}

The fragment shader:


uniform vec3 color;

void main(void) {
	gl_FragColor = vec4(color, 1.0);
}

The uniform “vec3 color” is set to (0, 1, 1). Setting uniforms works with my ShaderProg class; it has been tested with glBegin(); / glEnd();.

The problem is that NOTHING is drawn… just black. Everything else in the rendering loop works: things drawn with non-interleaved FloatBuffers and vertex arrays (not using the VertexArray class) plus shaders, and glBegin(); / glEnd(); too.
Only the stuff that should be drawn with the VertexArray class doesn’t show up. Just black, transparent, nothing.

Probably this is due to the arguments I pass to [icode]glVertexAttribPointer(…);[/icode], but I can’t find what’s wrong with them :frowning:

I really need help! I’ve been googling for what feels like years, I’ve had this issue for over a week now, and I just can’t find the cause…

What is this line supposed to do? I’m pretty sure it’s throwing you an error.

gl.glEnableClientState(GL2.GL_ARRAY_BUFFER);

Exactly what are your arguments to gl.glVertexAttribPointer()? I don’t see you passing GL_FLOAT as data type anywhere…

Point 1: ehmm… don’t you have to enable the array buffer if you want to draw with glVertexAttribPointer? … I’ll try to remove it…

Point 2: Oh yes, I forgot to mention: the output I posted above is what gets printed for the exact arguments:

It shows “type: 5126”, and GL_FLOAT is 0x1406, which is 5126 in decimal…

I probably should have pointed that out and printed it with %x…
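For what it’s worth, a one-liner makes the decimal/hex match explicit:

```java
public class EnumHex {
    public static void main(String[] args) {
        int glFloat = 0x1406; // GL_FLOAT's value from the GL headers
        System.out.printf("type: %d = 0x%X%n", glFloat, glFloat);
    }
}
```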

GL_ARRAY_BUFFER is the most common VBO type. It has nothing to do with vertex arrays and is not an accepted enum for glEnableClientState(). http://www.opengl.org/sdk/docs/man2/xhtml/glEnableClientState.xml

That looks fine assuming your attribute location is actually 0.

Anyway, I don’t see you actually putting your data into buffer. Isn’t it just a ByteBuffer filled with 0s right now? Never mind…

Oh. glDrawArrays() takes the number of vertices you want to draw, not the number of primitives. You have 4 vertices, not one. One vertex does not make a quad, so it’s silently discarded.
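To make the vertex-vs-primitive count concrete, here is a small plain-Java check (no GL calls; the stride just mirrors the vec2-float layout from earlier in the thread):

```java
import java.nio.ByteBuffer;

public class DrawCount {
    public static void main(String[] args) {
        int stride = 2 * Float.BYTES;                  // vec2 of floats = 8 bytes
        ByteBuffer buf = ByteBuffer.allocateDirect(4 * stride);
        // glDrawArrays' count parameter is the number of VERTICES to draw,
        // so one GL_QUADS primitive needs count = 4, not count = 1.
        int vertexCount = buf.capacity() / stride;
        System.out.println(vertexCount);
    }
}
```

So for the quad above, the call would be render(gl, GL2.GL_QUADS, 0, 4) rather than a count of 1.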

Yeah… you’re right about glEnableClientState: it throws a 1280 error (GL_INVALID_ENUM)… I’ve removed it, but it’s still not working :frowning:

This works, btw (it’s a non-interleaved array):


int aVertexAttrib = normalMapShader.getAttributeLocation("aVertex");
int aTexCoordsAttrib = normalMapShader.getAttributeLocation("aTexCoords");
int aNormCoordsAttrib = normalMapShader.getAttributeLocation("aNormCoords");
gl.glEnableVertexAttribArray(aVertexAttrib);
gl.glEnableVertexAttribArray(aTexCoordsAttrib);
gl.glEnableVertexAttribArray(aNormCoordsAttrib);
gl.glVertexAttribPointer(aVertexAttrib, 2, GL2.GL_FLOAT, false, 0, vert);
gl.glVertexAttribPointer(aTexCoordsAttrib, 2, GL2.GL_FLOAT, false, 0, texCoords);
gl.glVertexAttribPointer(aNormCoordsAttrib, 2, GL2.GL_FLOAT, false, 0, normCoords);

gl.glDrawArrays(GL2.GL_QUADS, 0, 4);

gl.glDisableVertexAttribArray(aVertexAttrib);
gl.glDisableVertexAttribArray(aTexCoordsAttrib);
gl.glDisableVertexAttribArray(aNormCoordsAttrib);

Oh god. THANK YOU! (ninja editor ^^)

Really… I can’t say how much I love you! … This took me weeks… the mistake is ALWAYS something simple… it works… thank you very much :slight_smile:

Been there, done that too, mate. :wink: