If I use shaders in one of my programs, should I use the shaders built into OpenGL 2.0 or the ARB shaders?
Use the 2.0 shaders, since the ARB shaders are getting quite old. As long as your hardware supports it, I think the 2.0 shaders are worth learning more than making a desperate attempt to support the old/legacy hardware that some potential players might have. Additionally, there are newer shader versions than the one for OpenGL 2.0, but they only differ in the syntax of the GLSL code; all of the OpenGL glue is the same for 2.0 and up. The ARB shaders have very different glue, and I believe the language is closer to assembly, which just isn't fun to work with.
I agree with lhkbob. ARB shaders are very difficult to use and very low-level, and that is not worth the effort unless lots of people playing your game own such hardware. Personally, I prefer not to use shaders at all on graphics cards that support only ARB shaders.
The only problem is that both the LWJGL demo shaders and the tutorial at http://lwjgl.org/wiki/index.php?title=GLSL_Shaders_with_LWJGL use ARB shaders. Also, the LWJGL demo works on both my home computer and my school computer, while a program I made with the 2.0 shaders only works on my machine.
I was under the impression that all shaders were built on top of the low-level shaders, so the low-level shaders will be supported for a long time to come.
I think there is a small misunderstanding here. There are legacy ARB shaders written in low-level (assembly-like) code, but this thread is only about whether one should use GLSL shaders via the ARB functions or the 2.0 functions. Since both are more or less the same, with the only difference being an ARB prefix or suffix, it is only a matter of which OpenGL version you want to support.
I would go with the 2.0 stuff (drivers should be there by now). Simply leave out the ARB suffixes and prefixes in the examples you find and you should be fine…
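For example, the two paths map one-to-one in LWJGL. Here is a minimal sketch of compiling a vertex shader both ways, assuming LWJGL 2's GL20 and ARBShaderObjects bindings (the class name ShaderPaths is made up for illustration):

import org.lwjgl.opengl.ARBShaderObjects;
import org.lwjgl.opengl.ARBVertexShader;
import org.lwjgl.opengl.GL20;

public class ShaderPaths {

    // Core OpenGL 2.0 path.
    public static int compileVertexShaderCore(CharSequence source) {
        int shader = GL20.glCreateShader(GL20.GL_VERTEX_SHADER);
        GL20.glShaderSource(shader, source);
        GL20.glCompileShader(shader);
        return shader;
    }

    // Legacy ARB path: the same calls, just with ARB-suffixed names.
    public static int compileVertexShaderARB(CharSequence source) {
        int shader = ARBShaderObjects.glCreateShaderObjectARB(ARBVertexShader.GL_VERTEX_SHADER_ARB);
        ARBShaderObjects.glShaderSourceARB(shader, source);
        ARBShaderObjects.glCompileShaderARB(shader);
        return shader;
    }
}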
You can write two wrappers, one around the ARB functions and one around the core functions, then choose the appropriate implementation at runtime.
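A minimal sketch of that idea, assuming LWJGL 2's GLContext capabilities query (the ShaderBackend class and its subclasses are made-up names for illustration):

import org.lwjgl.opengl.ARBShaderObjects;
import org.lwjgl.opengl.ARBVertexShader;
import org.lwjgl.opengl.ContextCapabilities;
import org.lwjgl.opengl.GL20;
import org.lwjgl.opengl.GLContext;

public abstract class ShaderBackend {

    public abstract int createVertexShader();
    public abstract void shaderSource(int shader, CharSequence source);
    public abstract void compileShader(int shader);

    // Prefer the core 2.0 path; fall back to the ARB extension when the
    // driver only exposes GLSL through GL_ARB_shader_objects.
    public static ShaderBackend choose() {
        ContextCapabilities caps = GLContext.getCapabilities();
        if (caps.OpenGL20) {
            return new CoreBackend();
        }
        if (caps.GL_ARB_shader_objects && caps.GL_ARB_vertex_shader) {
            return new ArbBackend();
        }
        throw new UnsupportedOperationException("No GLSL support on this context");
    }

    static class CoreBackend extends ShaderBackend {
        public int createVertexShader() {
            return GL20.glCreateShader(GL20.GL_VERTEX_SHADER);
        }
        public void shaderSource(int shader, CharSequence source) {
            GL20.glShaderSource(shader, source);
        }
        public void compileShader(int shader) {
            GL20.glCompileShader(shader);
        }
    }

    static class ArbBackend extends ShaderBackend {
        public int createVertexShader() {
            return ARBShaderObjects.glCreateShaderObjectARB(ARBVertexShader.GL_VERTEX_SHADER_ARB);
        }
        public void shaderSource(int shader, CharSequence source) {
            ARBShaderObjects.glShaderSourceARB(shader, source);
        }
        public void compileShader(int shader) {
            ARBShaderObjects.glCompileShaderARB(shader);
        }
    }
}

The rest of the code then only ever talks to ShaderBackend, so supporting both generations of hardware costs one capability check at startup.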
OK that makes sense.
Can anybody help me convert this to return the modelview-projection matrix?
package com.google.sites.lvgeeks.deepthought.utils.opengl.lwjgl;

import org.lwjgl.opengl.*;
import org.lwjgl.util.glu.*;

public class AnotherBloodyCamera {

    public static void look(int FOV, float aspectratio, float atx, float aty, float atz,
                            float yaw, float pitch, float roll) {
        yaw += 180;

        // Wrap each angle into the 0..360 range.
        while (pitch > 360 || pitch < 0) {
            if (pitch > 360) {
                pitch -= 360;
            }
            if (pitch < 0) {
                pitch += 360;
            }
        }
        while (yaw > 360 || yaw < 0) {
            if (yaw > 360) {
                yaw -= 360;
            }
            if (yaw < 0) {
                yaw += 360;
            }
        }
        while (roll > 360 || roll < 0) {
            if (roll > 360) {
                roll -= 360;
            }
            if (roll < 0) {
                roll += 360;
            }
        }

        // Convert to radians and pick a look-at target 30 units in front of the eye.
        double yawrad = (Math.PI * yaw) / 180;
        double pitchrad = (Math.PI * pitch) / 180;
        float lookz = (float) (atz + (30 * Math.cos(yawrad)));
        float lookx = (float) (atx + (30 * Math.sin(yawrad)));
        float looky = (float) (aty + (30 * Math.tan(pitchrad)));

        // Rebuild the projection matrix.
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GLU.gluPerspective(FOV, aspectratio, 1f, 9000f);

        // Rebuild the modelview matrix: apply roll first, then aim the camera.
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
        GL11.glLoadIdentity();
        GL11.glRotatef(-roll, 0, 0, 1);
        GLU.gluLookAt(atx, aty, atz, lookx, looky, lookz, 0, 1, 0);
        //System.out.println(lookx + " " + looky + " " + lookz);
    }
}
thanks! 
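One way to get the combined modelview-projection matrix back out of the fixed-function pipeline is to read both matrices with glGetFloat and multiply them. A minimal sketch, assuming LWJGL 2's BufferUtils and org.lwjgl.util.vector.Matrix4f (the class name MvpHelper is made up for illustration); call it after look() has set up both matrices:

import java.nio.FloatBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;
import org.lwjgl.util.vector.Matrix4f;

public class MvpHelper {

    // Reads the current fixed-function matrices and returns projection * modelview.
    public static Matrix4f getModelViewProjection() {
        FloatBuffer buf = BufferUtils.createFloatBuffer(16);

        GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, buf);
        Matrix4f projection = new Matrix4f();
        projection.load(buf); // Matrix4f.load reads column-major, matching OpenGL's layout.

        buf.clear();
        GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, buf);
        Matrix4f modelView = new Matrix4f();
        modelView.load(buf);

        // The MVP takes points from model space straight to clip space.
        return Matrix4f.mul(projection, modelView, new Matrix4f());
    }
}

The resulting Matrix4f can then be uploaded to a shader uniform yourself instead of relying on the gl_ModelViewProjectionMatrix built-in.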