Graphics cards and Macs

So I’ve been experimenting with LWJGL and I’ve finally got a bit of an understanding of the basics, and then bam! I get this error:


Unable to create vertex shdaer: 
ERROR: 0:1: '' :  version '330' is not supported

Exception in thread "main" java.lang.IllegalStateException: Function is not supported
	at org.lwjgl.BufferChecks.checkFunctionAddress(BufferChecks.java:58)
	at org.lwjgl.opengl.GL30.glBindVertexArray(GL30.java:1512)
	at com.pickens.game.Game.dispose(Game.java:136)
	at com.pickens.game.Game.end(Game.java:216)
	at com.pickens.util.ShaderProgram.attachVertexShader(ShaderProgram.java:43)
	at com.pickens.game.Game.init(Game.java:52)
	at com.pickens.game.Game.gameLoop(Game.java:89)
	at com.pickens.game.Game.<init>(Game.java:42)
	at com.pickens.game.Game.main(Game.java:222)

Ecumene said it meant that my Mac doesn’t have a good enough graphics card to run my program. Does that mean I need to get a new computer? My Mac is only a year old, so you’d figure it could handle anything you throw at it, right? I even updated to the most recent version and I still get this error.

So is the problem my code? Or the computer?

(BTW I was following SHC’s LWJGL tutorial)

Mac OS only returns an OpenGL 3+ context if you explicitly request one. If you are on LWJGL 2.9.1, use this to create the Display:


import org.lwjgl.opengl.ContextAttribs;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.PixelFormat;

// Explicitly request a 3.2 core, forward-compatible context.
PixelFormat pFormat = new PixelFormat();
ContextAttribs cAttribs = new ContextAttribs(3, 2).withProfileCore(true)
                                                  .withForwardCompatible(true);

Display.create(pFormat, cAttribs);

If you are using LWJGL 3 with GLFW, these window hints will get you a core context:


glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
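
For completeness, here is a rough sketch of where those hints fit, assuming static imports from org.lwjgl.glfw.GLFW and org.lwjgl.system.MemoryUtil; the window size and title are just placeholders, and GL.createCapabilities() is the call in recent LWJGL 3 builds:

// Set the four hints above first, then create the window and context.
long window = glfwCreateWindow(640, 480, "Game", NULL, NULL);
glfwMakeContextCurrent(window);
GL.createCapabilities(); // initialise the OpenGL bindings for this context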

Another thing: even if you want OpenGL 4.1 or something, you must use 3.2 as the major and minor version numbers on Mac OS. This is down to how the default OpenGL drivers are implemented. Since your Mac is modern (only a year old, so I’m guessing the card is an Intel HD 4000), you can get an OpenGL 4.1 context by requesting a 3.2 core context.
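
If you want to confirm what you actually got, you can query the version string once the context is current (a quick sketch, assuming the LWJGL 2 setup above and an import of org.lwjgl.opengl.GL11):

// After Display.create(pFormat, cAttribs), ask the driver which
// version it actually handed you; on a modern Mac this will
// typically report 4.1 even though you requested 3.2.
System.out.println("OpenGL version: " + GL11.glGetString(GL11.GL_VERSION));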

(Edit: Oh, you were following my series! Thanks!)

Sweet! That fixed it, thanks! Now I’m getting this error from my fragment shader:


Unable to create fragment shdaer: 
ERROR: 0:3: Invalid use of layout 'location'
ERROR: 0:6: Use of undeclared identifier 'gl_Position'
ERROR: 0:6: Use of undeclared identifier 'vertex'

What could be the problem there?

Fragment Shader Code:


#version 330 core

layout(location = 0) in vec2 vertex;

void main() {
	gl_Position = vec4(vertex, 0.0, 1.0);
}

The problem is that you are passing in your vertex shader code as the fragment shader code. That’s the problem: gl_Position only exists in the vertex stage, which is why the fragment compiler complains that it is undeclared.
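
For reference, a minimal fragment shader to pair with that vertex shader could look something like this (fragColor is just a placeholder output name):

#version 330 core

// The core profile has no gl_FragColor; you declare your own output.
out vec4 fragColor;

void main() {
    fragColor = vec4(1.0, 1.0, 1.0, 1.0); // plain white for now
}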

Haha wow, I must’ve got it reversed when I was reading the tutorial. Thanks!