Getting Surface Normals Right

I have a class that takes an arbitrary set of data points representing a grid of quads that make up a surface. The user supplies this 3D matrix, and my class renders it. That much works, but now I want to generate proper normals for lighting, and I don’t know how to do it for arbitrary shapes. Alan_W gave me a start with some code he wrote for a similar problem, but the results don’t come out right: the surface looks as if I’m not calculating any normals at all. I don’t know what’s wrong, but something is :slight_smile: Could someone help me figure out how to calculate these normals? Here is my current attempt:


public void setData (double[][][] data) {
	this.data = data;
		
	normals = new Vec3D[data.length][data[0].length][2];
		
	for (int j = 0; j < data.length - 1; j++) {
		for (int i = 0; i < data[j].length - 1; i++) {
			// Edge vectors for the first triangle of this quad:
			// uu = P(j+1,i) - P(j,i), vv = P(j+1,i+1) - P(j,i)
			Vec3D uu = new Vec3D();
			uu.subtract(
					new Vec3D(data[j+1][i][0], data[j+1][i][1], data[j+1][i][2]),
					new Vec3D(data[j][i][0], data[j][i][1], data[j][i][2]));
			Vec3D vv = new Vec3D();
			vv.subtract(
					new Vec3D(data[j+1][i+1][0], data[j+1][i+1][1], data[j+1][i+1][2]),
					new Vec3D(data[j][i][0], data[j][i][1], data[j][i][2]));
			normals[j][i][0] = new Vec3D();
			normals[j][i][0].cross(uu, vv);
			normals[j][i][0].normalize();
				
			// Edge vectors for the second triangle, reusing uu and vv:
			// uu = P(j,i+1) - P(j+1,i+1), vv = P(j,i) - P(j+1,i+1)
			uu.subtract(
					new Vec3D(data[j][i+1][0], data[j][i+1][1], data[j][i+1][2]),
					new Vec3D(data[j+1][i+1][0], data[j+1][i+1][1], data[j+1][i+1][2]));
			vv.subtract(
					new Vec3D(data[j][i][0], data[j][i][1], data[j][i][2]),
					new Vec3D(data[j+1][i+1][0], data[j+1][i+1][1], data[j+1][i+1][2]));
			normals[j][i][1] = new Vec3D();
			normals[j][i][1].cross(uu, vv);
			normals[j][i][1].normalize();
		}
	}
}

and in my render method...
for (int j = 0; j < data.length - 1; j++) {
	gl.glBegin(GL.GL_TRIANGLES);
	for (int i = 0; i < data[j].length - 1; i++) {
		// first triangle of the quad, using the first stored normal
		gl.glNormal3d(normals[j][i][0].x, normals[j][i][0].y, normals[j][i][0].z);
		gl.glVertex3dv(data[j][i]);
		gl.glVertex3dv(data[j+1][i]);
		gl.glVertex3dv(data[j+1][i+1]);
					
		// second triangle, sharing the (j,i) -> (j+1,i+1) diagonal
		gl.glNormal3d(normals[j][i][1].x, normals[j][i][1].y, normals[j][i][1].z);
		gl.glVertex3dv(data[j+1][i+1]);
		gl.glVertex3dv(data[j][i+1]);
		gl.glVertex3dv(data[j][i]);
	}
	gl.glEnd();
}
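In case my Vec3D class is part of the problem, the math I think I’m supposed to be doing per triangle is n = normalize((B − A) × (C − A)). Here’s the same thing with plain double arrays (just a sketch, these helper names are made up, not my real class):

```java
// Standalone sketch of a per-triangle normal, no Vec3D involved.
public class NormalSketch {

    // component-wise a - b
    static double[] sub(double[] a, double[] b) {
        return new double[] { a[0] - b[0], a[1] - b[1], a[2] - b[2] };
    }

    // cross product u x v
    static double[] cross(double[] u, double[] v) {
        return new double[] {
            u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]
        };
    }

    // unit-length copy of v
    static double[] normalize(double[] v) {
        double len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new double[] { v[0] / len, v[1] / len, v[2] / len };
    }

    // normal of triangle (a, b, c); its direction depends on the winding a -> b -> c
    static double[] triangleNormal(double[] a, double[] b, double[] c) {
        return normalize(cross(sub(b, a), sub(c, a)));
    }

    public static void main(String[] args) {
        // Flat triangle lying in the XZ plane; for this winding the normal
        // comes out along +Y.
        double[] n = triangleNormal(
            new double[] { 0, 0, 0 },
            new double[] { 0, 0, 1 },
            new double[] { 1, 0, 0 });
        System.out.println(n[0] + " " + n[1] + " " + n[2]); // prints 0.0 1.0 0.0
    }
}
```

If that little program gives sensible normals but my class doesn’t, then the bug is in how I’m picking the edge vectors, not the cross product itself.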

Hmmm, maybe the vectors aren’t normalized correctly. I had this problem too, even though I was normalizing my normals. Try using glEnable(GL.GL_RESCALE_NORMAL) or GL_NORMALIZE. Hope it helps :slight_smile:

No, neither of those made a difference, thanks though. I figure I must be getting the math wrong, because my cube (built from quads) lights correctly when I supply the proper glNormal calls by hand. I guess I’m just hoping someone who’s comfortable with the math of this subject can spot my error, or tell me I’m completely wrong and point me in the right direction ;D

EDIT: You know what’s frustrating? This actually makes things look pretty nice:


normals[j][i][0].x = Math.random()*5;
normals[j][i][0].y = Math.random()*5;
normals[j][i][0].z = Math.random()*5;
normals[j][i][0].normalize();

Well, I can’t help you with the math, but here’s another idea:
Have you debugged your application and looked at the calculated values of your normals? Maybe the normals point in the wrong direction (especially the z component). Just try using the opposite z value (for example, -1 instead of 1)…
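If it helps, here’s a quick numeric way to check the direction: for a surface you know roughly faces up, the dot product of each normal with (0, 1, 0) should be positive, and flipping a normal is just negating each component. A little sketch (plain arrays, the names are made up):

```java
// Sketch: sanity-checking a normal's direction with a dot product,
// and flipping it if it faces the wrong way.
public class NormalCheck {

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // Negated copy of n ("use the opposite z-value", generalized to all axes).
    static double[] negated(double[] n) {
        return new double[] { -n[0], -n[1], -n[2] };
    }

    public static void main(String[] args) {
        double[] up = { 0, 1, 0 };
        double[] n = { 0, -1, 0 };     // a normal that came out pointing down
        if (dot(n, up) < 0) {
            n = negated(n);            // flip it toward the expected side
        }
        System.out.println(n[1]);      // prints 1.0
    }
}
```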

Wow! I took your idea and it worked. Instead of
normals[j][i][1].cross(uu, vv)
I just use
normals[j][i][1].cross(vv, uu)

Now it’s beautiful ;D

There’s a simple little trick for remembering the orientation of the cross product of two vectors, called the right-hand rule: point the fingers of your right hand along the first vector, curl them toward the second, and your thumb points along the cross product. Might be helpful when thinking about this stuff…
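For example, with the standard basis vectors, x × y = z, and swapping the operands flips the sign (y × x = −z), which is exactly why swapping uu and vv fixed the lighting. A quick check (cross written out by hand here):

```java
// Sketch: the right-hand rule on the standard basis vectors,
// and the anti-commutativity of the cross product.
public class CrossDemo {

    static double[] cross(double[] u, double[] v) {
        return new double[] {
            u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]
        };
    }

    public static void main(String[] args) {
        double[] x = { 1, 0, 0 };
        double[] y = { 0, 1, 0 };
        double[] xy = cross(x, y);   // right-hand rule: x cross y points along +z
        double[] yx = cross(y, x);   // swapped operands: points along -z
        System.out.println(xy[2] + " " + yx[2]);  // prints 1.0 -1.0
    }
}
```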