Easiest Way to Pass Textures to Fragment Shader

Hey guys,

I’m pretty new to shaders, so I’m a little bit stuck. What I’m trying to do should be easy: I want to take the current pixel and multiply it by another color to get a kind of blue hue over the entire screen (not including the HUD). The end goal is to give everything a night-time feel.

It seems like every example I am finding uses the SpriteBatch class with LibGDX or Slick to handle passing texture data to the shader. I already have a large portion of rendering code in place and I’d like to find a way to do something like this:

ARBShaderObjects.glUseProgramObjectARB(shaderProgram);
updatePlayingField(0);		
Actor.updateCast();								
player.update();	
updatePlayingField(1);
Particle.updateAll();
ARBShaderObjects.glUseProgramObjectARB(0);

FloatingText.updateAll();
updateHUD();

That would be in my main game loop. Basically I have a simple shader working now, but all it does is change every pixel to a specific color. All of the individual sprites are drawn in each of those methods in that loop. It doesn’t seem possible that I can just get the “current” pixel color value when the fragment shader runs, sadly… :frowning:

Is there an easy way to handle shaders without re-writing all of my existing rendering code and over-complicating the systems I already have in place? I guess another option might be to apply a shader to the entire buffer before rendering the HUD? I don’t know - any and all suggestions are welcome. More code available upon request - didn’t want to clutter the post since I have basic shaders already working. ;D

  • Steve

You don’t “pass” a texture to a shader. You bind a texture and pass texture coordinates for each vertex. The vertex shader passes the texture coordinates down the pipeline to each fragment, with the value interpolated between the vertices. In the fragment shader, you then need to sample from the currently bound texture at unit N by using the texture coordinates for the given fragment.

Firstly: ARB is old school; use GL20 instead. Secondly, your rendering code shouldn’t need to change dramatically. You just pass a color for each vertex, along with position and texture coordinates. Then you pass the color attribute through the vertex shader like so:

attribute vec4 a_color;
varying vec4 v_color;

void main() {
    //pass input color to frag shader
    v_color = a_color;

    // ... output gl_Position and pass texcoords along ...
}

(If you’re using newer GLSL, then it’s “in” and “out” syntax)
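For reference, the same pass-through in newer GLSL (#version 130 and up) uses `in`/`out` instead of `attribute`/`varying`. A sketch — the `a_`/`v_`/`u_` names here are just a naming convention, not required:

```glsl
#version 130

in vec4 a_position;
in vec2 a_texCoord;
in vec4 a_color;

uniform mat4 u_projView; // combined projection * view matrix

out vec4 v_color;
out vec2 v_texCoord;

void main() {
    // pass color and texcoords through to the fragment shader
    v_color = a_color;
    v_texCoord = a_texCoord;
    gl_Position = u_projView * a_position;
}
```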

Then you multiply the current texture color by the vertex color (which in your case will be blue). Your fragment shader looks like:

uniform sampler2D tex;
varying vec4 v_color;
varying vec2 v_texCoord;

void main() {
    vec4 texColor = texture2D(tex, v_texCoord);
    gl_FragColor = texColor * v_color;
}

If all of this is gibberish, then you really need to start from the beginning and learn OpenGL in its modern form. :wink:

And if you want to write your own shader management, see here:

So the way I’m reading this… when you bind a texture to a quad, it should automatically be handed off to the vertex shader where it can, in turn, be processed by the fragment shader?

Do you need to manually modify any uniforms in the shaders for the texture or anything?

Great tutorials, by the way. I was going off the LWJGL ones, which use the ARB stuff. I’ve done a decent amount of 2D game dev in the past, but never really messed with shaders until now; seems like you can do some really cool stuff with them.

  • Steve

Well, you aren’t “passing” a texture to a shader. You just bind one (or multiple) textures to the GL state, and they stay bound until you bind a different texture.

Then you sample (i.e. “texture fetch”) from the fragment shader. The fragment shader uses sampler2D uniforms to determine which texture unit to sample from. More info on texture sampling and multiple texture units in the following lessons:


Good luck… :slight_smile:

Back to your original question: the bluish night effect.

First of all, the current color value of the screen is not accessible in the fragment shader, but…

If you just want to tint the screen a bit, you can render something transparent (alpha blending) over the whole screen. Or you could add a uniform to your normal shader that enables/disables the night effect for each rendered object:


uniform float nightAmount; // set from Java: 0.0 = day, 1.0 = full night
const vec3 NIGHT_COLOR = vec3(0.0, 0.0, 0.9);

void main() {
    vec3 color;
    // ...normal shader stuff that computes color...
    gl_FragColor = vec4(mix(color, NIGHT_COLOR, nightAmount), 1.0);
}
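Combining that with the texture sampling from earlier, a complete fragment shader for the night tint could look something like this (an untested sketch; the varying names assume the vertex shader above):

```glsl
uniform sampler2D tex;
uniform float nightAmount; // 0.0 = day, 1.0 = full night tint

varying vec4 v_color;
varying vec2 v_texCoord;

const vec3 NIGHT_COLOR = vec3(0.0, 0.0, 0.9);

void main() {
    // normal texturing and vertex-color modulation
    vec4 texColor = texture2D(tex, v_texCoord) * v_color;
    // blend the lit color toward the night color by nightAmount
    gl_FragColor = vec4(mix(texColor.rgb, NIGHT_COLOR, nightAmount), texColor.a);
}
```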

So I am messing around with the rendering method for individual sprites. I think I’m starting to get an idea of how things’ll work, but I still have to bind the TexCoord to the vertex shader as an attribute, correct?

// Bind the current texture to the current shader if one is in use.
if(useShader)
{
     setShaderUniform(currentShader, "texture", getTextureId(temp));
}
				
temp.bind();
				
float fSrcX = ((float)srcX / temp.getTextureWidth());
float fSrcY = ((float)srcY / temp.getTextureHeight());
// note: despite the names, these are the right/bottom edge coords in [0,1], not widths
float fSrcWidth = (((float)srcX + (float)srcWidth) / temp.getTextureWidth());
float fSrcHeight = (((float)srcY + (float)srcHeight) / temp.getTextureHeight());
				
GL11.glColor4f(red, green, blue, alpha);
GL11.glBegin(GL11.GL_QUADS);
// Top Left
GL11.glTexCoord2f(fSrcX, fSrcY);
GL11.glVertex2f(x,y);
// Top Right
GL11.glTexCoord2f(fSrcWidth, fSrcY);
GL11.glVertex2f(x + width, y);
// Bottom Right
GL11.glTexCoord2f(fSrcWidth,fSrcHeight);
GL11.glVertex2f(x + width,y + height);
// Bottom Left
GL11.glTexCoord2f(fSrcX,fSrcHeight);
GL11.glVertex2f(x,y + height);
GL11.glEnd();
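Incidentally, those four coordinate lines boil down to a pixel-rect-to-UV conversion; factored into a helper (the helper name is made up, not from any library), it would look like:

```java
public class UVHelper {
    // Convert a source rectangle given in pixels into normalized [0,1]
    // texture coordinates, returned as {u0, v0, u1, v1}.
    static float[] srcRectToUV(int srcX, int srcY, int srcW, int srcH,
                               int texW, int texH) {
        float u0 = (float) srcX / texW;          // left edge
        float v0 = (float) srcY / texH;          // top edge
        float u1 = (float) (srcX + srcW) / texW; // right edge
        float v1 = (float) (srcY + srcH) / texH; // bottom edge
        return new float[] { u0, v0, u1, v1 };
    }
}
```

For example, a 16x16 region at (16, 16) of a 64x64 texture gives {0.25, 0.25, 0.5, 0.5}.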

Thanks for answering the endless array of stupid questions. :slight_smile:

  • Steve

Hmm, then I find pages like this that make me think those can just be populated by something like this in the vertex shader:

varying vec2 texture_coordinate;

void main()
{
    // Transforming The Vertex
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

    // Passing The Texture Coordinate Of Texture Unit 0 To The Fragment Shader
    texture_coordinate = vec2(gl_MultiTexCoord0);
}

From - http://stackoverflow.com/questions/11537356/pass-through-vertex-shader-for-texture-mapping

I keep thinking about this in terms of me setting an attribute from my Java code in the shader program…

  • Steve

You don’t give the texture ID to the shader. Instead, you give it the “texture unit” – which is by default zero (GL_TEXTURE0). Then the shader will sample from whatever texture is bound at the specified texture unit.

//specify the active texture unit
//NOTE: this is optional since GL_TEXTURE0 is the default state
glActiveTexture(GL_TEXTURE0); 

//this binds the texture ID to the active texture unit, 
//which we set to be "unit zero"
glBindTexture(GL_TEXTURE_2D, myTexID);

...

//bind your shader
glUseProgram(programID);

//find location of "tex" uniform in frag shader
int texUniformLoc = glGetUniformLocation(programID, "tex");

//tell your frag shader to sample from "unit zero"
//use int literal, not GL_TEXTURE* constants
glUniform1i(texUniformLoc, 0);

...

//now render your quad with vertex positions & texture coordinates
...

[quote]I keep thinking about this in terms of me setting an attribute from my Java code in the shader program…
[/quote]
You can pass information to the shaders in two ways: as an attribute per-vertex or as a uniform per batch. Something like texture coordinates needs to be passed as an attribute (glTexCoord2f) since it is different for each vertex.
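To make the distinction concrete, here is how the two kinds of input look at the top of a shader (illustrative names only):

```glsl
// attributes: one value per vertex (position, texcoords, color, ...)
attribute vec2 a_position;
attribute vec2 a_texCoord;

// uniforms: one value per batch/draw call (matrices, flags, texture units, ...)
uniform mat4 u_projection;
uniform sampler2D tex; // holds the texture *unit* index, set via glUniform1i
```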

Like I said – read up on some tutorials and books about OpenGL. It looks like you are using old-school attributes (gl_MultiTexCoord0) and outdated libraries (SlickUtil), which I wouldn’t recommend if you’re trying to learn OpenGL in the 21st century.