[HELP] Converting Grayscale (InfraRed) to Color with JOGL [HELP]

Here’s a list of cards that support GL_ARB_fragment_shader
http://www.delphi3d.net/hardware/extsupport.php?extension=GL_ARB_fragment_shader
The 5200 series is in there. IIRC, GLSL support was initially only available as a beta feature and you had to change a registry setting to enable it. The latest driver revision should support it, though, so maybe you need to update your driver?

I think I made a stupid mistake again. I had not even installed the driver yet :frowning:

Ok Guys,

It seems the GLSL program is running without any errors. No errors returned from the error handler. I shall try the GLSL code from page 1 and combine it with my own picture.
After that I shall report my status… ;D

Greedz Bahamut Lagoon

Ok, it seems that the whole GLSL thing is working fine. Thanks all, but we are not done yet. Thanks to this matter, a new question is coming up:

Is there any other way to do false coloring without using GLSL, Cg and all those other shader languages? I mean, can OpenGL do this by using glReadPixels…blabla… glDrawPixels, or something like that???
And why is pixelMap so badly supported???

Well the non-shader way I would do it would be to just take the incoming texture image, which changes every frame anyway, and convert it in Java. Then pass the converted texture into OpenGL. This is a fine way to do it if you don’t have shader hardware, but it takes a little more CPU time.
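For instance, something along these lines — just a sketch, the class name and the hue mapping are mine (chosen to mirror the grey-drives-hue shader approach discussed here), not tested against your data:

```java
// Sketch of a CPU-side false-colour conversion (hypothetical helper,
// not from this thread): maps each 8-bit grey value to a packed RGB
// pixel via a hue sweep, using a 256-entry lookup table so the
// per-frame cost is one array access per pixel.
public class FalseColour {

    private static final int[] LUT = buildLut();

    // Build the grey -> packed RGB lookup table once, up front.
    private static int[] buildLut() {
        int[] lut = new int[256];
        for (int g = 0; g < 256; g++) {
            float hue = g / 255.0f;                 // grey drives the hue
            java.awt.Color c = java.awt.Color.getHSBColor(hue, 1.0f, 1.0f);
            lut[g] = c.getRGB() & 0x00FFFFFF;       // drop the alpha byte
        }
        return lut;
    }

    // Convert a grayscale frame (one byte per pixel) to packed RGB ints,
    // ready to be repacked and uploaded as a GL_RGB texture.
    public static int[] convert(byte[] grey) {
        int[] rgb = new int[grey.length];
        for (int i = 0; i < grey.length; i++) {
            rgb[i] = LUT[grey[i] & 0xFF];
        }
        return rgb;
    }
}
```

The lookup table keeps the per-frame work trivial, but you still pay for the texture re-upload every frame, which is the real cost of the non-shader path.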

You could use palettised textures; however, newer hardware has stopped supporting these. And you'd be limited to (IIRC) a 256-colour gradient.

You mean that the conversion will not be done by OpenGL, but by java.awt, if I understand you correctly. But my CPU is already 100% busy with everything. So what you are saying is that the shading approach is more advisable than the non-shading way. Correct me if I am wrong.

I got the pixelmap support thing from here, though I haven't verified it:
http://archives.seul.org/linuxgames/Apr-2004/msg00001.html
Also, the minimum pixel-map size that implementations must support is only 32 values.
Maybe you could use this technique as a fallback code path when shaders are not available?

The shader shifts the colour-index-to-RGB conversion work to the GPU, which is probably preferable. The only downside I guess is that using shaders limits the hardware your program will run on to the last 3 or 4 generations of cards. It's up to you whether that's a problem or not.

Hi there,

I fixed the GLSL script. This script allows me to change from grayscale to color. That is good, but when I use this script, I lose my black and my white. I am still very new to this… but I hope you guys will help me with my first steps ;D. My colour table should run (from cold to warm) black, purple, red, yellow, white.

By the way, I downloaded a great tool for testing GLSL scripts. Check http://www.typhoonlabs.com

uniform sampler2D grayscaleTexture;

vec3 convertHsvToRgb ( float H , float S , float V )
{
  int Hi = int ( mod ( H / 60.0 , 6.0 ) );
  float f = H / 60.0 - float ( Hi );
  float p = V * ( 1.0 - S );
  float q = V * ( 1.0 - ( f * S ) );
  float t = V * ( 1.0 - ( ( 1.0 - f ) * S ) );
  vec3 rgb;

  if ( Hi == 0 )
    rgb = vec3 ( V , t , p );
  else if ( Hi == 1 )
    rgb = vec3 ( q , V , p );
  else if ( Hi == 2 )
    rgb = vec3 ( p , V , t );
  else if ( Hi == 3 )
    rgb = vec3 ( p , q , V );
  else if ( Hi == 4 )
    rgb = vec3 ( t , p , V );
  else
    rgb = vec3 ( V , p , q );
  return rgb;
}

void main ( )
{
  vec4 grayscale = texture2D ( grayscaleTexture , gl_TexCoord[0].st );

  float G = grayscale.r;
  float H = G * 360.0;
  vec3 rgb = convertHsvToRgb ( H , 1.0 , 1.0 );

  gl_FragColor = vec4 ( rgb, 0.0 );
}

void main ( )
{
  vec4 grayscale = texture2D ( grayscaleTexture , gl_TexCoord[0].st );

  float G = grayscale.r;
  float H = G * 360.0;
  vec3 rgb = convertHsvToRgb ( H , 1.0 , 1.0 );

   // just multiply the RGB values with the greyscale
   // to get your brightness back into the result.

   // adjust G if you want, like: G = G*x+(1.0-x)
   rgb *= G;

  gl_FragColor = vec4 ( rgb, 0.0 ); // shouldn't that be 1.0, or do you really want it to be fully transparant when blending is enabled?
}
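Alternatively, since the palette you want runs black → purple → red → yellow → white, you could skip HSV entirely and interpolate straight through those stops — a pure HSV sweep at S=1, V=1 can never produce black or white, which is exactly why you lose them. Here is a rough CPU-side sketch of that idea (the control-point colours are my guesses at your palette, pick your own); the same blend ports directly to the shader using mix():

```java
// Sketch: piecewise-linear colour ramp through the palette described
// above (black -> purple -> red -> yellow -> white). Unlike a pure
// HSV sweep, this keeps black at grey 0.0 and white at grey 1.0.
public class ColourRamp {

    // Control points, evenly spaced over [0, 1]. RGB components in [0, 1].
    // These exact values are illustrative, not from the thread.
    private static final float[][] STOPS = {
        {0f,   0f, 0f  },  // black
        {0.5f, 0f, 0.5f},  // purple
        {1f,   0f, 0f  },  // red
        {1f,   1f, 0f  },  // yellow
        {1f,   1f, 1f  },  // white
    };

    // Map grey in [0, 1] to an RGB triple by blending between the two
    // surrounding stops (the same thing mix() does in GLSL).
    public static float[] map(float grey) {
        float scaled = Math.min(Math.max(grey, 0f), 1f) * (STOPS.length - 1);
        int lo = Math.min((int) scaled, STOPS.length - 2);
        float f = scaled - lo;
        float[] rgb = new float[3];
        for (int c = 0; c < 3; c++) {
            rgb[c] = STOPS[lo][c] * (1f - f) + STOPS[lo + 1][c] * f;
        }
        return rgb;
    }
}
```

In the fragment shader this becomes a handful of mix() calls (or a 1D lookup texture), and you get the brightness back without the extra rgb *= G step.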

Thanks a lot.
By the way… I now have 2 textures, but both get converted, and I only want one of them to be converted. The problem is that I have 2 layers in the same section. One layer is a real image and the other one (above it) is the IR, which can become transparent. But both are converted, instead of only the IR layer.
Can I use GLSL to select a layer or something like that?

Shaders get applied to all geometry which passes through them. If you only want a shader to apply to a specific group of polys, then bind it beforehand, do your rendering, and unbind it afterwards. Be careful though, as binding and unbinding a shader is one of the slowest operations available.

Ok, maybe I found a better solution. The two textures cover the same section (location). To select the first texture, I found out that I can use glGetUniformLocation and glUniform1iARB to set the uniform variable in GLSL (which is a sampler2D) with a texture ID.
The texture ID comes from glGenTextures, if I am correct. But the thing is, when I assigned the result of glGetUniformLocation to “_infraRed”:


  uniform sampler2D grayscale;
  ....


  ....
  gl.glUseProgramObjectARB ( handle );
  _infraRed = gl.glGetUniformLocationARB ( handle, "grayscale" );
  System.out.println ( _infraRed );
  ....

the _infraRed seems to be -1. What did I do wrong this time??? In other words, using multitexturing, my whole screen turns black. AAHHHHHH!!!

Are you sure your program is set up correctly? Did you check the info log for compilation/linking errors?

Incorrect. The sampler2D uniform should be set to the texture unit which that sampler should read from. E.g. you bind your texture ID to texture unit 0, then set the sampler uniform to 0.

-1 means that the uniform couldn’t be found for some reason. Check the output of the compile and link stages.

The error handler is working fine and gives no error in return. Ok, I think I just made a stupid mistake. You mean that GL_TEXTURE0 + id = GL_TEXTUREid, that is:

[quote] gl.glGenTextures(1, ir_tex_id); //returns a pointer
gl.glActiveTexture(GL_TEXTURE0 + ir_tex_id ); //specify the texture unit <<<-----------
gl.glBindTexture(GL_TEXTURE_2D, ir_tex_id); //creating the texture object
gl.glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR); //set textureParam
gl.glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR); //set textureParam
gl.glTexImage2D(GL_TEXTURE_2D, _x, _y, ir_raw.width, ir_raw.height, 0, GL_RGB, GL_UNSIGNED_BYTE, ir_raw.data); //defines the texture
[/quote]
So, if I understand what you are saying: after glActiveTexture() is called, I can use ir_tex_id as the id for the uniform method?

After your call to glActiveTexture, you have to enable GL_TEXTURE_2D on that unit:


glActiveTexture(GL_TEXTURE0 + 0); glEnable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0 + 1); glEnable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0 + 2); glEnable(GL_TEXTURE_2D);

int addr0 = glGetUniformLocationARB(program, toByteString("myUniformName_A", true));
int addr1 = glGetUniformLocationARB(program, toByteString("myUniformName_B", true));
int addr2 = glGetUniformLocationARB(program, toByteString("myUniformName_C", true));
glUniform1iARB(addr0, 0);
glUniform1iARB(addr1, 1);
glUniform1iARB(addr2, 2);

glActiveTexture(GL_TEXTURE0 + 2); glDisable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0 + 1); glDisable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0 + 0); glDisable(GL_TEXTURE_2D);

   private final ByteBuffer toByteString(String str, boolean isNullTerminated)
   {
      int length = str.length();

      if (isNullTerminated)
         length++;

      ByteBuffer buff = BufferUtils.createByteBuffer(length);
      buff.put(str.getBytes());

      if (isNullTerminated)
         buff.put((byte) 0);

      buff.flip();

      return buff;
   }

Thanks… yes… of course… I had a stupid conversion problem (String -> byte)… that might be the reason why it returned -1.

Are you sure that's legit? IIRC you're supposed to use the constants GL_TEXTURE1, GL_TEXTURE2, etc. The fact that they're all consecutive integers may or may not be a side effect of a specific implementation.
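For what it's worth, the enum values in glext.h are defined as a consecutive block (GL_TEXTURE0 = 0x84C0 up through GL_TEXTURE31 = 0x84DF), so GL_TEXTURE0 + i is the same constant as GL_TEXTUREi. The questionable part of the quoted code is rather that it passes a texture name from glGenTextures where a unit index belongs. A trivial check with the raw values (copied from glext.h, not invented):

```java
// The texture-unit constants from glext.h form a consecutive block,
// so arithmetic like GL_TEXTURE0 + unitIndex yields GL_TEXTUREunitIndex.
public class TextureUnitEnums {
    public static final int GL_TEXTURE0  = 0x84C0;
    public static final int GL_TEXTURE1  = 0x84C1;
    public static final int GL_TEXTURE2  = 0x84C2;
    public static final int GL_TEXTURE31 = 0x84DF;
}
```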