[SOLVED] Problem with Transformation Matrices passed into GLSL with LWJGL

Hey All, I really hate to do this, but after a few hours of googling and trying to find a solution, JGO, you are my last hope. As an effort to learn 3D programming in Java, I decided to port the Modern OpenGL Programming Wikibook (http://en.wikibooks.org/wiki/OpenGL_Programming#Modern_OpenGL) over to Java/LWJGL, and I'm hosting the code in a GitHub repo: (https://github.com/alihelmy/lwjglTutorial)

I had been going fine until I got to lesson 4 (http://en.wikibooks.org/wiki/OpenGL_Programming/Modern_OpenGL_Tutorial_04), which deals with transformation matrices and passing them as uniform variables to the vertex/fragment GLSL shaders. The code runs with no errors, but absolutely nothing is drawn on the screen.

I have tried replacing the transformation matrix with an identity matrix, to no avail. Removing the glUniformMatrix4() call and the transformMatrix usage in the shaders reverts the code to working correctly.

You can take a look at the complete code here (https://github.com/alihelmy/lwjglTutorial/tree/master/LessonFour); for brevity, I have included the more "important" parts (IMHO) below.

loading the transformMatrix uniform location and preparing the buffer; this happens once:
String transformMatrixAttributeName = "transformMatrix";
transformMatrixAttributeIndex = glGetUniformLocation(shaderProgram, transformMatrixAttributeName);
transformationValues = BufferUtils.createFloatBuffer(16);
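For completeness, a cheap sanity check at this point (not in the repo, just a sketch): glGetUniformLocation() returns -1 if the name is misspelled or the uniform was optimised out of the shader, and uploads to location -1 are silently ignored:

int location = glGetUniformLocation(shaderProgram, transformMatrixAttributeName);
if (location == -1) {
    // either the name doesn't match the shader, or the GLSL compiler optimised the uniform away
    throw new IllegalStateException("uniform not found: " + transformMatrixAttributeName);
}
transformMatrixAttributeIndex = location;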

initialisation of the transformation matrix and loading it into the FloatBuffer; this happens every frame:
float movement = (float) Math.sin(timeElapsed / 1000.0 * (2 * 3.14) / 5); // sweeps -1 to +1, full cycle every 5 seconds
float angle = (float) (timeElapsed / 1000.0 * 45) % 360; // 45 degrees per second
Vector3f zAxis = new Vector3f(0f, 0f, 1f);
Matrix4f translationMatrix = new Matrix4f();
translationMatrix.setIdentity();
translationMatrix.translate(new Vector3f(movement, 0f, 0f)); // slide along the x axis
Matrix4f rotationMatrix = new Matrix4f();
rotationMatrix.setIdentity();
rotationMatrix.rotate(angle, zAxis); // spin around the z axis
Matrix4f transformationMatrix = Matrix4f.mul(translationMatrix, rotationMatrix, null); // rotate first, then translate
transformationValues.clear();
transformationMatrix.store(transformationValues); // write the 16 floats into the buffer

the glUniformMatrix4() call, made on every draw:
[…] // other uniforms are set here, such as fade
glUniformMatrix4(transformMatrixAttributeIndex, false, transformationValues);

vertex shader code:
#version 120
attribute vec3 coord3D;
attribute vec3 vColor;
varying vec3 fColor;
uniform float fade;
uniform mat4 transformMatrix;
void main(void) {
    gl_Position = transformMatrix * vec4(coord3D, 1.0);
    fColor = vColor;
}

drawing itself is done by passing the vertices and their colours through glVertexAttribPointer() calls, then drawing with glDrawElements(), roughly as sketched below.
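For reference, that part looks roughly like this (a sketch; vertexBuffer, colourBuffer and indexBuffer are placeholder names, the real ones are in the repo):

glEnableVertexAttribArray(coord3DAttributeIndex);
glVertexAttribPointer(coord3DAttributeIndex, 3, false, 0, vertexBuffer); // 3 floats per position
glEnableVertexAttribArray(vColorAttributeIndex);
glVertexAttribPointer(vColorAttributeIndex, 3, false, 0, colourBuffer); // 3 floats per colour
glDrawElements(GL_TRIANGLES, indexBuffer); // element count and type come from the IntBuffer
glDisableVertexAttribArray(coord3DAttributeIndex);
glDisableVertexAttribArray(vColorAttributeIndex);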

Again, the code itself works fine without the transformation matrices, which leads me to think that is where the true problem lies. You can always take a complete look at the GitHub repo mentioned above.

Thanks a lot for any help, guys. I know it may be tiresome to look at and trace through code like this, but I really am at a loss and could really use a hand.

You forgot to flip the matrix float buffer? xD
EDIT: Yeah, the example is missing that too. xDDDDDDDD
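To spell out why it fails silently: Matrix4f.store() leaves the FloatBuffer's position at the end of the data, and (as far as I know) LWJGL derives the number of matrices to upload from the buffer's remaining elements, so with nothing remaining it uploads zero matrices and raises no GL error. The per-frame code just needs one extra line:

transformationValues.clear();                     // position = 0, limit = capacity
transformationMatrix.store(transformationValues); // writes 16 floats; position is now 16
transformationValues.flip();                      // position = 0, limit = 16: ready to be read
glUniformMatrix4(transformMatrixAttributeIndex, false, transformationValues);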

[quote="theagentd,post:2,topic:37362"]
You, Sir/Madam, ARE AWESOME! Thank you VERY much…
I guess that’ll stick in my head this time :smiley:

[quote="codemonkey,post:3,topic:37362"]

Thank you. That would be Sir. xD Glad to help!