Hello guys 
I’ve got a problem with my current VertexArray implementation. Unlike most Java implementations, it uses a ByteBuffer instead of a FloatBuffer. Maybe that is the problem, but it should still work, shouldn’t it?
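As far as I know, a ByteBuffer can hold interleaved float data just as well as a FloatBuffer, as long as it is direct and uses native byte order. My BufferUtils.newByteBuffer is supposed to do roughly the following (simplified sketch for illustration, not the actual class):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class BufferUtilsSketch {
	// Sketch of what I assume BufferUtils.newByteBuffer does:
	// a direct, native-ordered buffer, which is what GL expects for client-side data.
	public static ByteBuffer newByteBuffer(int capacityInBytes) {
		return ByteBuffer.allocateDirect(capacityInBytes)
				.order(ByteOrder.nativeOrder());
	}
}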
The code for the VertexArray class is the following:
package org.matheusdev.arcengine.ogl;
import java.nio.ByteBuffer;
import java.util.List;
import javax.media.opengl.GL2;
import org.matheusdev.arcengine.utils.BufferUtils;
/**
 * @author matheusdev
 *
 */
public class VertexArray {
	protected final List<ShaderAttrib> attributes;
	protected final int totalComponents;
	protected final int totalByteLength;
	protected final ByteBuffer buffer;
	/**
	 * Alias for <code>VertexArray(glslShader.getAttributes(), vertices)</code>.
	 */
	public VertexArray(ShaderProg glslShader, int vertices) {
		this(glslShader.getAttributes(), vertices);
	}
	public VertexArray(List<ShaderAttrib> attributes, int vertices) {
		this.attributes = attributes;
		int components = 0;
		int stride = 0;
		for (ShaderAttrib attribute : attributes) {
			components += attribute.components;
			stride += attribute.getTypeByteSize();
		}
		this.totalComponents = components;
		this.totalByteLength = stride;
		this.buffer = BufferUtils.newByteBuffer(totalByteLength * vertices);
	}
	public void begin(GL2 gl) {
		gl.glEnableClientState(GL2.GL_ARRAY_BUFFER);
		int offset = 0;
		for (ShaderAttrib attribute : attributes) {
			buffer.position(offset);
			gl.glEnableVertexAttribArray(attribute.location);
			gl.glVertexAttribPointer(
					attribute.location,
					attribute.components,
					attribute.getDataType(),
					false,
					totalByteLength,
					buffer);
			System.out.printf("glEnableVertexAttribArray(%d)%n", attribute.location);
			System.out.printf("glVertexAttribPointer(%n\tindx: %d,%n\tsize: %d,%n\ttype: %d,%n\tnormalized: false,%n\tstride: %d,%n\tbuffer: %s(offset = %d));%n",
					attribute.location, attribute.components, attribute.getDataType(), totalByteLength, buffer, offset);
			offset += attribute.byteLength;
		}
	}
	public void render(GL2 gl, int mode, int offset, int count) {
		gl.glDrawArrays(mode, offset, count);
	}
	public void end(GL2 gl) {
		for (ShaderAttrib attribute : attributes) {
			gl.glDisableVertexAttribArray(attribute.location);
		}
		gl.glDisableClientState(GL2.GL_ARRAY_BUFFER);
	}
	@Override
	public String toString() {
		StringBuffer str = new StringBuffer();
		str.append(String.format("VertexArray[components: %d, stride: %d, attributes:\n", totalComponents, totalByteLength));
		for (ShaderAttrib attribute : attributes) {
			str.append("\t").append(attribute).append("\n");
		}
		str.append("];");
		return str.toString();
	}
	public ByteBuffer getBuffer() {
		return buffer;
	}
	public int getComponentSum() {
		return totalComponents;
	}
	public int getComponentByteSum() {
		return totalByteLength;
	}
}
I’ve tested the [icode]ShaderProg[/icode] class; it works perfectly with [icode]glBegin(); / glEnd();[/icode]. The array’s [icode].toString();[/icode] prints the following:
So the attributes of the shader are loaded properly. The stride is in bytes and seems to be okay (components = 2 = vec2, all floats, which means stride = 2 components * 4 bytes = 8 bytes).
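Just to spell out that arithmetic (purely illustrative):

// Stride check for a single "vec2 aVertex" attribute, all floats:
int components = 2;                               // vec2
int bytesPerComponent = 4;                        // one float = 4 bytes
int strideBytes = components * bytesPerComponent; // = 8 bytes per vertex
System.out.println("stride = " + strideBytes);    // prints 8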
The ByteBuffer is filled properly on my side, with this code:
		onlyVerts = new VertexArray(Arrays.asList(
				onlyVertShader.getAttribute("aVertex")), 4);
		ByteBuffer buf0 = onlyVerts.getBuffer();
		buf0
			.putFloat(256).putFloat(0)
			.putFloat(384).putFloat(0)
			.putFloat(384).putFloat(128)
			.putFloat(256).putFloat(128)
			.flip();
Its content is then checked with this code and the position reset:
while (buf0.hasRemaining()) {
	System.out.printf("Vertex: aVertex(%G, %G)%n",
			buf0.getFloat(), buf0.getFloat());
}
buf0.position(0);
This is what gets printed:
Finally, this is what the VertexArray#begin(gl) method prints:
The array is rendered like this:
onlyVertShader.use(gl);
onlyVerts.begin(gl);
onlyVerts.render(gl, GL2.GL_QUADS, 0, 1);
onlyVerts.end(gl);
The vertex shader is the following:
uniform mat4 uViewProjMatrix;
attribute vec2 aVertex;
void main(void) {
	gl_Position = uViewProjMatrix * vec4(aVertex, 0.0, 1.0);
}
The fragment shader:
uniform vec3 color;
void main(void) {
	gl_FragColor = vec4(color, 1.0);
}
The uniform [icode]vec3 color[/icode] is set to (0, 1, 1). Setting uniforms works with my ShaderProg class; it has been tested with glBegin(); / glEnd();.
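Under the hood, setting that uniform boils down to something like this (simplified sketch, not my actual ShaderProg code; programId stands for the linked program handle):

// Simplified sketch, assuming programId is the linked shader program:
gl.glUseProgram(programId);
int loc = gl.glGetUniformLocation(programId, "color");
gl.glUniform3f(loc, 0f, 1f, 1f); // (0, 1, 1)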
The problem is that NOTHING is drawn… just black. Other things drawn in the same rendering loop with non-interleaved FloatBuffers and vertex arrays (not using the VertexArray class) and shaders work fine, and glBegin(); / glEnd(); works too.
Only the geometry that goes through the VertexArray class is never drawn: just black, transparent, nothing.
Probably this is due to the arguments I pass to [icode]glVertexAttribPointer(…);[/icode], but I can’t find what’s wrong with them.
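For comparison, the non-interleaved path that does work for me looks roughly like this (simplified from memory; location stands for the attribute location queried from the shader):

// Working non-interleaved setup, simplified (one FloatBuffer per attribute, stride 0):
FloatBuffer verts = ByteBuffer.allocateDirect(4 * 2 * 4) // 4 vertices * 2 floats * 4 bytes
		.order(ByteOrder.nativeOrder())
		.asFloatBuffer();
verts.put(new float[] { 256, 0,  384, 0,  384, 128,  256, 128 });
verts.flip();

gl.glEnableVertexAttribArray(location);
gl.glVertexAttribPointer(location, 2, GL2.GL_FLOAT, false, 0, verts);
gl.glDrawArrays(GL2.GL_QUADS, 0, 4); // 4 vertices
gl.glDisableVertexAttribArray(location);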
I really need help! It feels like I’ve been googling for years; I’ve had this issue for more than a week now and I just can’t find the cause…