OpenGL Shaders not working

So… in my personal framework for LWJGL 3 (which I started yesterday), I tried to implement shaders. I’m completely new to this whole modern-OpenGL shader thing, and I’ve only known OpenGL for about half a year, so I expected some mistakes. However, the code I have should work, but it doesn’t. My problem: I’m drawing a triangle to the screen and trying to change its color using GLSL, but the triangle stays white. No matter what I do, it stays white. I spent about two hours trying to fix it, looking for any OpenGL errors and browsing the Internet, but nothing. Everything seems perfect. Except the god-damn triangle.

So, hope you guys can help me.
Here’s my code related to shaders (sorry, it’s a “little” messy):
(Main.java) http://pastebin.com/S8tQbjkZ
(Shader.java) http://pastebin.com/NQUq9ZsU
(vert.glsl) http://pastebin.com/0MjS4F94
(frag.glsl) http://pastebin.com/6Znfd6bV

(Window.java) http://pastebin.com/JXsnFjYB (I don’t think it’s really important, but all the basic OpenGL stuff is there, so, yeah)

Thanks in advance.

What about setting the color value in the fragment shader and not in the vertex shader?


//SIMPLE VERTEX SHADER
void main(){
	gl_Position = gl_Vertex;
}

//SIMPLE FRAGMENT SHADER
void main(){
	gl_FragColor = vec4(0.5, 1, 0.25, 1);
}

Already tried it, still same result… :-\

Since you didn’t say what you have tried, I’ll tell you whatever comes to mind.

Does setting the color to 0 still render a white triangle?


I use libGDX or plain OpenGL in C/C++, so I’m not used to LWJGL.
Is the screen being cleared each frame? (I don’t see the glClear.)

Okay, that’s in the Window class; try using glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

Setting glClearColor to 0, 0, 0, 1 doesn’t change anything.

Yep, changing the color to 0 gives the white triangle once again. Which means, if I’m correct, that the shader isn’t even used. Hmm…

You are missing a glUseProgram :point:

        public void use()
        {
                glUseProgram(id);
        }
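
        Having the method there isn’t enough on its own, though; it has to be called every frame before the draw call so the program is actually bound when the triangle is drawn. Roughly like this (shader and drawTriangle() are just placeholders, not your actual names):

        // hypothetical render method: bind the program before issuing the draw call
        public void render()
        {
                glClear(GL_COLOR_BUFFER_BIT);

                shader.use();      // the glUseProgram(id) from above
                drawTriangle();    // placeholder for however you draw the triangle
        }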

Already in the Shader class.

Thanks,

You are creating your window (and hence your GL context) on an auxiliary thread. All OpenGL commands and window commands must be run on the main thread.

see the note under glfwCreateWindow here: http://www.glfw.org/docs/latest/group__window.html

How come the window is created successfully, then?

I am not sure about the exact behavior this would cause; however, you are doing glfwInit and the GL calls on different threads. I could imagine this causing some undesired behavior.

They’re called on the same thread. The Window object has its own thread and Game object. All the methods you see in the inner Game class in Main are actually called on the Window’s thread. Trust me, I’ve read the GLFW documentation pretty well. “All the GLFW and OpenGL calls must be on the same thread.” That’s exactly what I did.
`
new Thread(new Runnable()
{
    @Override
    public void run()
    {
        if ((id = glfwCreateWindow(width, height, title, NULL, NULL)) == NULL)
            throw new RuntimeException("Window couldn't be created");

        ByteBuffer vidMode = glfwGetVideoMode(glfwGetPrimaryMonitor());

        glfwSetWindowPos(
                id,
                (GLFWvidmode.width(vidMode) - width) / 2,
                (GLFWvidmode.height(vidMode) - height) / 2
        );

        glfwMakeContextCurrent(id);

        GLContext.createFromCurrent();

        glViewport(0, 0, width, height);
        glMatrixMode(GL_PROJECTION);
        glOrtho(0, width, 0, height, -1, 1);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();

        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

        glfwShowWindow(id);

        game.init();

        while (glfwWindowShouldClose(id) == GL_FALSE) {
            update();
            render();
        }

        glfwDestroyWindow(id);
    }
}).start();
`
From the constructor of the Window class.

Are you f**king kidding me LOL?!? Good luck with the rest of your life with this much arrogance.

Window.initGLFW is the first call in the main method; hence, it runs on the main thread.

The rest of the calls are on an auxiliary thread. The GLFW documentation clearly says to do all of this on the main thread (it should, however, work if everything is done on the same thread).

You don’t have to listen to me; good luck figuring out this problem.

glfwInit is called on the main thread.
From the main method:
Window.initGLFW();

And from the Window class:
`
public static void initGLFW()
{
    glfwSetErrorCallback(ErrorCallback.Util.getDefault());

    if (glfwInit() != GL_TRUE)
        throw new RuntimeException("GLFW couldn't be initialized");

    glfwDefaultWindowHints();
    glfwWindowHint(GLFW_VISIBLE, GL_FALSE);
    glfwWindowHint(GLFW_RESIZABLE, GL_TRUE);
}
`

I have thought the windowing system through, and I see no need to be offensive.

We are going in circles: you are calling glfwInit on the main thread and the rest on an auxiliary thread.

All GLFW and GL calls must be done on the main thread.
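
Concretely, that means collapsing the Window thread into main: glfwInit, window creation, glfwMakeContextCurrent, the game loop, and the cleanup all on the one thread that runs main. A rough sketch, not your actual code (imports omitted; they are the same ones Window.java already uses, and on newer LWJGL 3 builds the capability call is GL.createCapabilities() instead of GLContext.createFromCurrent()):

// rough single-threaded sketch: everything on the main thread
public static void main(String[] args)
{
    if (glfwInit() != GL_TRUE)
        throw new RuntimeException("GLFW couldn't be initialized");

    long window = glfwCreateWindow(800, 600, "Shaders", NULL, NULL);
    if (window == NULL)
        throw new RuntimeException("Window couldn't be created");

    glfwMakeContextCurrent(window);
    GLContext.createFromCurrent();

    // shader / buffer setup goes here (your game.init())

    while (glfwWindowShouldClose(window) == GL_FALSE)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        // update() and render() go here
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwDestroyWindow(window);
    glfwTerminate();
}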

Alright, as you say, I’ll try calling everything on the main thread, and we’ll see if there’s any difference.

No difference. Still the same white triangle I saw 30 times before.

Throw a glGetError into your render method and have it println any errors you may have.

Also please post your new relevant code.

edit:
If there are no errors, try using VBOs instead and see if it works. This could very easily be because you are mixing immediate mode with shaders, which can behave unpredictably across different drivers.

glGetError gives GL_INVALID_OPERATION. Be right on the new code.

I gotta go right now.

Try putting a glGetError after every OpenGL call to see which call causes the error.
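
Something along these lines (the helper name is made up) makes it easy to tag each call site, e.g. checkError("glCompileShader"); right after the call:

// hypothetical helper: call it after each GL call with a label,
// so the offending call shows up in the console
public static void checkError(String where)
{
    int error = glGetError();
    if (error != GL_NO_ERROR)
        System.err.println("GL error 0x" + Integer.toHexString(error) + " after " + where);
}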

If it is glInit, then this error is a false alarm: it is caused by GLEW, which is a C library that I imagine LWJGL depends on. It is a known issue caused by an early glGetString, I believe; it is no reason to get alarmed, means nothing, and [ramble ramble ramble]…

If there are no other errors, then there are effectively no errors at all, which makes me believe the problem is caused by you mixing shaders with immediate mode. If that is the case, start learning more modern GL, such as VBOs. Try this program again with a VBO and it should work, as I see nothing else wrong (given I haven’t looked at it for more than 10–20 minutes).
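
For example, a VBO version of the triangle could look roughly like this. It assumes your vertex shader still reads gl_Vertex, so the data is fed through the classic vertex array pointer rather than a generic attribute; the class and method names are made up:

import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;

import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;

// hypothetical sketch: draw the triangle from a VBO instead of glBegin/glEnd
public class TriangleVbo
{
    private int vbo;

    // call once, e.g. from your init()
    public void init()
    {
        FloatBuffer vertices = BufferUtils.createFloatBuffer(6);
        vertices.put(new float[] {
            -0.5f, -0.5f,   // bottom left
             0.5f, -0.5f,   // bottom right
             0.0f,  0.5f    // top
        }).flip();

        vbo = glGenBuffers();
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, vertices, GL_STATIC_DRAW);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
    }

    // call every frame, after binding the shader with glUseProgram
    public void render()
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, 0L);   // feeds gl_Vertex from the bound VBO
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableClientState(GL_VERTEX_ARRAY);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
    }
}

A fully modern path would declare the position as an in attribute and use glVertexAttribPointer instead, but that also means rewriting the shaders, so this keeps your current GLSL working.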

If there are other errors, well, now you know which call is causing them, and it should be easier to work from there :slight_smile:

Ok, thanks. I wanted to implement VBOs right after shaders anyway, guess I’ll just do it sooner.