I’ve tried to use OpenGL 2.0 with a vertex shader via LWJGL, but it fails when compiling the vertex shader. The odd thing is that the same shader compiles fine when I use the ARB extensions instead. Could it be my graphics card, an Intel HD 3000 in my MacBook Pro?
Perhaps you are not specifying the GLSL version you want. The ARB extension may only cater for early versions of GLSL, so there it doesn’t matter, but the core functions default to the newest.
Try sticking: #version 120 (No semi-colon)
on the very first line of your shader.
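To make the advice concrete, here is a minimal sketch (class and method names are hypothetical, and the shader body is a placeholder) of prepending the version directive in Java before handing the source to glShaderSource. The only rule that matters is that `#version 120` is the very first line, with no semicolon:

```java
// Hypothetical helper: prepend the GLSL version directive to a shader
// source string. The directive must be the first line of the source.
public class VersionedShader {
    static String withVersion(String source) {
        // No semicolon after the directive; newline separates it from the body.
        return "#version 120\n" + source;
    }

    public static void main(String[] args) {
        // Placeholder GLSL body, just to show where the directive lands.
        String body = "void main() { gl_Position = ftransform(); }";
        System.out.println(withVersion(body));
    }
}
```

The resulting string is what you would pass to glShaderSource, so the driver sees the directive before any declarations.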
private static boolean printLogInfo(int obj) {
    IntBuffer iVal = BufferUtils.createIntBuffer(1);
    glGetProgram(obj, GL_INFO_LOG_LENGTH, iVal); // <-- THIS LINE CAUSES CRASH; IF I REMOVE IT, IT'S OK
    int length = iVal.get();
    if (length > 1) {
        // We have some info we need to output.
        ByteBuffer infoLog = BufferUtils.createByteBuffer(length);
        iVal.flip();
        glGetProgramInfoLog(obj, iVal, infoLog);
        byte[] infoBytes = new byte[length];
        infoLog.get(infoBytes);
        String out = new String(infoBytes);
        System.out.println("Info log:\n" + out);
    } else {
        return true;
    }
    return false;
}
What is the crash? If it is an exception, post the stack trace.
And why are you using that horribly convoluted method? Look at the code I posted earlier:
int len = glGetProgrami(programHandle, GL_INFO_LOG_LENGTH);
String errLog = glGetProgramInfoLog(programHandle, len);
Don’t rely on the String to determine whether compilation was successful (sometimes a log will exist even if the shader is valid). Instead do something like this:
... compile shaders and attach them ...

// link our program
glLinkProgram(program);

// grab our info log
String infoLog = glGetProgramInfoLog(program, glGetProgrami(program, GL_INFO_LOG_LENGTH));

// if some log exists, print it to System.err
if (infoLog != null && infoLog.trim().length() != 0)
    System.err.println(infoLog);

// if the link failed, throw some sort of exception
if (glGetProgrami(program, GL_LINK_STATUS) == GL_FALSE)
    throw new LWJGLException("Failure in linking program. Error log:\n" + infoLog);
Exception in thread "main" org.lwjgl.opengl.OpenGLException: Invalid operation (1282)
at org.lwjgl.opengl.Util.checkGLError(Util.java:59)
at org.lwjgl.opengl.GL20.glGetProgram(GL20.java:551)
at Square.printLogInfo(ShaderBox2.java:219)
at Square.createVertShader(ShaderBox2.java:192)
at Square.<init>(ShaderBox2.java:127)
at ShaderBox2.init(ShaderBox2.java:71)
at ShaderBox2.<init>(ShaderBox2.java:26)
at ShaderBox2.main(ShaderBox2.java:75)
My graphics card supports shading language up to 1.20 and OpenGL 3.0 (except shading language 1.30). Using GLview, it reports the OpenGL version as 2.1 Apple-8.0.61.
Are there any tutorials that show rendering using a vertex shader? The ones I’ve seen still use glBegin?!
You really need to learn how to search for error fixes…
Whenever you get an OpenGLException, simply search for the name of the method that throws it, in this case [icode]glGetProgram[/icode].
Then open the OpenGL reference page (usually the first Google result; I searched for “glGetProgram”). The page contains an “Errors” section, which explains what causes each type of error.
Now we look up our error: in your stack trace, LWJGL reports ‘Invalid operation’, and there it is on the manual page: GL_INVALID_OPERATION.
So what’s the problem? That error is generated if the handle you pass does not refer to a program object.
It seems you never called [icode]glCreateProgram()[/icode], or at least its return value is not what you pass to the log check…
Try printing ‘obj’ in your code. If it’s <= 0, then something is wrong.
Also, use davedes’ code. Or try out his tutorial. He explains it very well.
Also, glGetProgram expects a program handle (from glCreateProgram). If you are trying to get the log of a shader object (from glCreateShader), you need to use glGetShaderi and glGetShaderInfoLog instead.
This is likely the source of your error, since your stack trace shows the glGetProgram call coming from your createVertShader() method.
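To tie the two earlier points together (don’t treat a non-empty log as failure, and check the right status for the right object type), here is a hedged sketch with a hypothetical class and method name. The decision logic is separated from the GL calls so the same check works for both cases: feed it glGetShaderi(shader, GL_COMPILE_STATUS) for shader objects or glGetProgrami(program, GL_LINK_STATUS) for program objects, together with the corresponding info log:

```java
// Hypothetical helper: decide success/failure from a GL status value and
// an info log. The status, not the log, determines success, because some
// drivers emit warnings in the log even for perfectly valid shaders.
public class GLStatusCheck {
    static final int GL_FALSE = 0; // matches the OpenGL constant

    static boolean statusOk(int status, String infoLog) {
        // Print any log (warnings included) without treating it as fatal.
        if (infoLog != null && !infoLog.trim().isEmpty()) {
            System.err.println(infoLog);
        }
        return status != GL_FALSE;
    }

    public static void main(String[] args) {
        // A warning log with a successful status is still a pass.
        System.out.println(statusOk(1, "warning: unused variable"));
        // A failed status is a failure regardless of the log contents.
        System.out.println(statusOk(0, "ERROR: 0:1: syntax error"));
    }
}
```

With this shape, the shader path and the program path differ only in which GL getter supplies the status and the log, which avoids exactly the shader-handle-into-glGetProgram mix-up described above.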