rendering with shaders

I’ve successfully managed to load and link a fragment and vertex shader. Now how do I render with it? I’ve got as far as the glUseProgram call. My vertex shader is


#version 120

uniform mat4 projection_matrix;
uniform mat4 modelview_matrix;

attribute vec3 a_Vertex;
attribute vec3 a_Color;
varying vec4 color;

void main(void) 
{
	vec4 pos = modelview_matrix * vec4(a_Vertex, 1.0);
	gl_Position = projection_matrix * pos;	
	color = vec4(a_Color, 1.0);
}

and the fragment shader is

#version 120

varying vec4 color;

void main(void)
{
	// Note: colour components are floats in the 0.0 to 1.0 range; this variable is unused below.
	vec4 anothercolor = vec4(1.0, 1.0, 1.0, 1.0);
	gl_FragColor = color;
}

Also, is it possible to have more than one fragment or vertex shader in a program?

The glUseProgram call sets the program to be used in place of the fixed-function pipeline. If there are errors within the program, no rendering will occur. Otherwise, you should be able to render as you normally would (e.g. with vertex arrays or glVertex calls). You can also have multiple vertex or fragment shaders in one program, but once combined there can only be one main() function per stage, and they can’t conflict with each other.
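
For example, a program linked from one vertex shader object and two fragment shader objects might look roughly like this (a sketch only; vertShader, fragMainShader and fragUtilShader are hypothetical ids of already-compiled shader objects):

// The shader ids below are hypothetical; they stand for compiled shader objects.
int prog = GL20.glCreateProgram();
GL20.glAttachShader(prog, vertShader);      // vertex stage
GL20.glAttachShader(prog, fragMainShader);  // fragment shader containing main()
GL20.glAttachShader(prog, fragUtilShader);  // fragment shader with helper functions only
GL20.glLinkProgram(prog);
GL20.glUseProgram(prog);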

Don’t I have to send the projection and modelview matrices in as uniforms and the vertex positions and colors as attributes? How do I do that?

Mmm, yes, that is very true; I didn’t look at the specifics of your shader. (With some shaders it’s still possible to use glMultMatrix etc. if you use the built-in uniforms such as gl_ModelViewMatrix, but it’s probably best that you’re moving away from that.)

After you’ve compiled and linked the program, you need to determine the “location” of each uniform. This is done by calling glGetUniformLocation(program_id, uniform_name). It returns an integer, which you can then pass to glUniform calls along with the actual values. There are multiple glUniform functions for the various data types and vector sizes, so you’ll have to use the one appropriate for each uniform’s type.
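
For example, something along these lines should work (a minimal sketch, assuming prog is your linked program id and modelviewBuffer/projectionBuffer are FloatBuffers already filled with 16 floats each in column-major order):

// Query the uniform locations once after linking; they don't change.
int mvmLoc = GL20.glGetUniformLocation(prog, "modelview_matrix");
int prmLoc = GL20.glGetUniformLocation(prog, "projection_matrix");

// Upload the matrices while the program is in use.
GL20.glUseProgram(prog);
GL20.glUniformMatrix4(mvmLoc, false, modelviewBuffer);   // false = don't transpose
GL20.glUniformMatrix4(prmLoc, false, projectionBuffer);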

You must perform a similar procedure with the vertex attributes. Since you’re working with shaders, there is no longer a well-defined ‘vertex’, ‘normal’ and ‘texture coordinate’. Instead, every model is just a set of generic vertex attributes bound to arbitrary integer slots. Use glGetAttribLocation the same way you’d query a uniform location to find the attribute slots for your “a_Vertex” and “a_Color”.

There are two ways to render with generic attributes. If you’re still using glBegin/glEnd, you call one of the glVertexAttrib functions (e.g. glVertexAttrib3f(index, x, y, z)), where index is the attribute location you found earlier. If you’re using vertex arrays, you use glVertexAttribPointer(index, …) instead of the regular glVertexPointer, glNormalPointer etc. With vertex arrays you must also enable each attribute with glEnableVertexAttribArray(index), and then perform the actual drawing with a call such as glDrawElements or glDrawArrays.
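
Roughly, the vertex-array path looks like this (a sketch only, assuming a linked program prog and FloatBuffers verticesBuffer and colorsBuffer holding three packed xyz positions and three rgb colours):

int vtx = GL20.glGetAttribLocation(prog, "a_Vertex");
int clr = GL20.glGetAttribLocation(prog, "a_Color");

GL20.glEnableVertexAttribArray(vtx);
GL20.glEnableVertexAttribArray(clr);

// 3 floats per vertex, not normalized, tightly packed (stride 0)
GL20.glVertexAttribPointer(vtx, 3, false, 0, verticesBuffer);
GL20.glVertexAttribPointer(clr, 3, false, 0, colorsBuffer);

GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, 3);

GL20.glDisableVertexAttribArray(vtx);
GL20.glDisableVertexAttribArray(clr);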

If you haven’t done vertex arrays before, I’d recommend sticking with the glBegin/glEnd method, but they’re important to learn at some point.

OK, I got it to render, but if I give each vertex a different color, the whole shape is the color of the first vertex. What’s wrong with the shader?

Always use the built-in modelview and projection matrices. They automatically set up (at zero cost) gl_NormalMatrix and gl_ModelViewProjectionMatrix for you, and they’re basically meant for the job.

Just keep in mind that the built-in matrices have been deprecated and are available in the compatibility profile only.

I know I’ll have to compute my own matrices eventually, but the problem now is that the color variable in the shader isn’t getting interpolated. I’m using different colors for each vertex, but the triangle is the color of the first color attribute I passed in.

Could you post your updated shader code and your Java (assuming it’s short enough) for me to look at? Did you look up the attribute bindings and check the shader compile and program link logs?
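
If you’re not checking them already, something like this should print the compile and link logs (a sketch against the LWJGL GL20 API, assuming shader and prog are your shader and program ids):

// Dump the info logs if compiling or linking failed.
if (GL20.glGetShaderi(shader, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
    System.err.println(GL20.glGetShaderInfoLog(shader, 1024));
}
if (GL20.glGetProgrami(prog, GL20.GL_LINK_STATUS) == GL11.GL_FALSE) {
    System.err.println(GL20.glGetProgramInfoLog(prog, 1024));
}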

OK, the problem wasn’t interpolation. The problem was that my quad was so close to the camera that it looked solid red. Now I’ve got to figure out how to get my camera set up right.

For some odd reason, when I was tinkering with the vertex positions, sometimes the triangle would disappear. One time it ended up looking as if someone had cut small triangles out of one of the sides at regular intervals. Why is that? Has this happened to anyone else, or is my code messed up? The shader should be fine, because I got it out of an OpenGL book.

It’s hard to say if your code is messed up if you don’t post any examples :) but I think something like that has happened to me when I set my near frustum plane to 0. Doing that certainly creates weird issues. This, and your last problem about being too close to the triangles, makes it sound like your matrix/camera code might need some work.
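
For instance, the near plane passed to gluPerspective has to be strictly greater than zero (a one-line sketch with made-up width and height values):

// near plane > 0; something around 0.1f to 1f is typical
GLU.gluPerspective(45f, (float) width / (float) height, 0.1f, 1000f);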

I don’t know. I’m probably a bit out of my league right now. I’ll just keep to fixed function and maybe convert one of my JOGL projects to LWJGL before working on shaders any more.

That effect occurs when you’ve got a triangle that’s being intersected with the far plane of your view frustum, and the Z-buffer doesn’t quite have the resolution to intersect it smoothly (possibly because it’s extremely far away?)

I might be wrong.

Cas :)

Arrghh, someone deleted my program, and I had to rewrite it.

Right now I’ve got


import java.awt.*;
import java.io.File;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;
import java.util.Vector;

import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.*;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.util.glu.GLU;
import org.lwjgl.util.vector.Matrix4f;
import org.lwjgl.util.vector.Vector4f;
import org.newdawn.slick.opengl.TextureLoader;
import org.newdawn.slick.util.*;

import com.google.sites.lvgeeks.deepthought.utils.opengl.lwjgl.*;

public class lwjgltest {
DisplayMode mode;
int frag, vert,prog;

public static void main(String args[])
{
	new lwjgltest();
}

public lwjgltest()
{
	init();
	while(!Display.isCloseRequested())
	{
		
		render();
		Display.update();
	}
}

/**
 * Creates the Display, sets up basic GL state, and compiles and links the shaders.
 */
public void init() {

try {
	Display.create();
	mode = Display.getDisplayMode();
} catch (LWJGLException e) {
	
	e.printStackTrace();
}
	
    // Enable z- (depth) buffer for hidden surface removal. 
    GL11.glEnable(GL11.GL_DEPTH_TEST);
    GL11.glDepthFunc(GL11.GL_LEQUAL);

    // Enable smooth shading.
    GL11.glShadeModel(GL11.GL_SMOOTH);

    // Define "clear" color.
    GL11.glClearColor(0f, 0f, 0f, 0f);

    // We want a nice perspective.
    GL11.glHint(GL11.GL_PERSPECTIVE_CORRECTION_HINT, GL11.GL_NICEST);

 
    // Create the shader objects and an empty program.
    frag = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
    vert = GL20.glCreateShader(GL20.GL_VERTEX_SHADER);
    prog = GL20.glCreateProgram();

    // Load and compile the shader sources via the glutil helper.
    glutil.shader(frag, new File("shaders/test.frag"));
    glutil.shader(vert, new File("shaders/test.vert"));

    // Attach both shaders, link the program and make it current.
    GL20.glAttachShader(prog, vert);
    GL20.glAttachShader(prog, frag);
    glutil.link(prog);

    GL20.glUseProgram(prog);
}


public void render()
{
	// Clear the colour and depth buffers before drawing.
	GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);

	// One triangle: three xyz positions and matching rgb colours.
	float vertices[] = new float[]{20, 0, 0,
			20, 20, 0,
			0, 20, 0};
	float colors[] = new float[]{1, 0, 0,
			0, 1, 0,
			0, 0, 1};

	FloatBuffer verticesBuffer = BufferUtils.createFloatBuffer(vertices.length);
	FloatBuffer colorsBuffer = BufferUtils.createFloatBuffer(colors.length);
	verticesBuffer.put(vertices).flip();
	colorsBuffer.put(colors).flip();

	// Note the float cast: integer division would truncate the aspect ratio.
	AnotherBloodyCamera.look(45, (float) mode.getWidth() / mode.getHeight(), 5, 5, 200, 0, 0, 0);
     
     
	// Look up the generic attribute slots for the shader inputs.
	int vtx = GL20.glGetAttribLocation(prog, "a_Vertex");
	int clr = GL20.glGetAttribLocation(prog, "a_Color");

	GL20.glEnableVertexAttribArray(vtx);
	GL20.glEnableVertexAttribArray(clr);

	// Look up the uniform locations for the two matrices.
	int mvm = GL20.glGetUniformLocation(prog, "modelview_matrix");
	int prm = GL20.glGetUniformLocation(prog, "projection_matrix");

	// Read back the fixed-function matrices set up by the camera
	// and upload them to the shader uniforms.
	FloatBuffer mm = BufferUtils.createFloatBuffer(16);
	FloatBuffer pm = BufferUtils.createFloatBuffer(16);

	GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, mm);
	GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, pm);

	GL20.glUniformMatrix4(mvm, false, mm);
	GL20.glUniformMatrix4(prm, false, pm);

	// Point the attributes at the client-side arrays and draw the triangle.
	GL20.glVertexAttribPointer(vtx, 3, false, 0, verticesBuffer);
	GL20.glVertexAttribPointer(clr, 3, false, 0, colorsBuffer);

	GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, 3);
}
}

and 

package com.google.sites.lvgeeks.deepthought.utils.opengl.lwjgl;
import org.lwjgl.opengl.*;
import org.lwjgl.util.glu.*;
public class AnotherBloodyCamera {

	/** Sets up the projection and modelview matrices for a camera at (atx, aty, atz) with the given yaw, pitch and roll. */
	public static void look(int FOV, float aspectratio, float atx, float aty, float atz, float yaw, float pitch, float roll)
	{
		// Offset yaw so that yaw 0 looks down the negative z axis.
		yaw += 180;

		// Wrap the angles into the 0..360 range.
		while (pitch > 360) pitch -= 360;
		while (pitch < 0)   pitch += 360;
		while (yaw > 360)   yaw -= 360;
		while (yaw < 0)     yaw += 360;
		while (roll > 360)  roll -= 360;
		while (roll < 0)    roll += 360;

		double yawrad = Math.toRadians(yaw);
		double pitchrad = Math.toRadians(pitch);

		// Pick a point 30 units in front of the camera to look at.
		float lookx = (float) (atx + (30 * Math.sin(yawrad)));
		float looky = (float) (aty + (30 * Math.tan(pitchrad)));
		float lookz = (float) (atz + (30 * Math.cos(yawrad)));
		
		 
		GL11.glMatrixMode(GL11.GL_PROJECTION);
		GL11.glLoadIdentity();
		GLU.gluPerspective(FOV, aspectratio, 1f, 9000f);

		GL11.glMatrixMode(GL11.GL_MODELVIEW);
		GL11.glLoadIdentity();
		GL11.glRotatef(-roll, 0, 0, 1);
		GLU.gluLookAt(atx, aty, atz, lookx, looky, lookz, 0, 1, 0);

		//System.out.println(lookx + " " + looky + " " + lookz);
      
	}
	
}

but it just gives me a black screen whenever I render. Why is that?