[HELP] Converting Grayscale (InfraRed) to Color with JOGL [HELP]

Hi there,

I need help applying false coloring to the output of my infrared camera. The image is displayed in black and white: black for cold objects, white for warm ones. I want to adjust the color table so that the picture is displayed in white, yellow, red, purple and black. That way a normal human being can see more details in the picture. I tried this in Photoshop with the same picture: I converted the grayscale image to an indexed-color picture, then altered its color table, and voila, I got a perfectly converted false-color picture.
I know it must be possible to do the same with JOGL. Can someone please give me some advice or example code to make this possible???

Here below some technical details:

The image from the server, which I call “rawImage”, is stored in a ByteBuffer. This image is uploaded into an RGBA texture, which I then draw every frame in the render loop.
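For reference, the upload step looks roughly like this. This is only a sketch: rawImage, width, height and textureId are my own placeholder names, it assumes the glTexImage2D overload that takes a java.nio.Buffer, and the format arguments may need adjusting to match how the camera data is actually packed.

// Bind the texture object and upload the grayscale bytes, letting GL expand them into an RGBA texture
gl.glBindTexture(GL.GL_TEXTURE_2D, textureId);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
gl.glTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGBA, width, height, 0,
                GL.GL_LUMINANCE, GL.GL_UNSIGNED_BYTE, rawImage);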

It looks like glTexImage using the GL_COLOR_INDEX format and glPixelMap should do what you want, but I’ve never used this myself.

Well, since you already have the image in a ByteBuffer, you could just preprocess it and do your color conversion there. You could come up with any conversion you like, but I would recommend HSV -> RGB, where your hue comes directly from the grayscale value and S and V are 1.0.

See:

For the equations there, first convert your grayscale value G into a 0-360 range. Since your S and V are always 1.0, several of the terms simplify out: the intermediate values become p = 0, q = 1 - f and t = f.

Then just spin through your ByteBuffer, convert the grayscales into a new RGBA ByteBuffer and use that as your texture.
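A minimal sketch of that CPU-side preprocessing step (my own illustration, not tested against your data; grayscaleToFalseColor is just a made-up name, it assumes one grayscale byte per pixel, and it leans on java.awt.Color.HSBtoRGB for the HSV -> RGB math):

import java.awt.Color;
import java.nio.ByteBuffer;

// Convert a grayscale ByteBuffer (one byte per pixel) into an RGBA ByteBuffer,
// using the grayscale value as the hue with S = V = 1.
public static ByteBuffer grayscaleToFalseColor(ByteBuffer gray, int width, int height) {
    ByteBuffer rgba = ByteBuffer.allocateDirect(width * height * 4);
    gray.rewind();
    for (int i = 0; i < width * height; i++) {
        float g = (gray.get() & 0xFF) / 255.0f;   // grayscale in [0, 1]
        // hue spans the full 0-360 circle; scale g by 5.0f/6.0f first if you want
        // the hottest values to stop at magenta instead of wrapping back to red
        int rgb = Color.HSBtoRGB(g, 1.0f, 1.0f);
        rgba.put((byte) ((rgb >> 16) & 0xFF));    // R
        rgba.put((byte) ((rgb >> 8) & 0xFF));     // G
        rgba.put((byte) (rgb & 0xFF));            // B
        rgba.put((byte) 0xFF);                    // A
    }
    rgba.flip();
    return rgba;
}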

Your other option would be to do the work mentioned above in a pixel shader, which will almost certainly get you better performance if your GPU’s pixel units are otherwise idle.

[Reaction to "anarchotron"]
Thanks for the reply

Is the conversion from the grayscale value G into a 0-360 range done by the CPU or the GPU??? My CPU is already 100% busy, so if the conversion runs on the CPU, the frame rate will (theoretically) drop critically, to 1-10 fps. And that is very bad… :-\

By the way… do you have some sample code to convert this HSV to RGB?

Let the GPU do everything.

Your fragment program would look (something) like this:


uniform sampler2D grayscaleTexture;

vec3 convertHsvToRgb(float H, float S, float V)
{
  int Hi = int(mod(H / 60.0, 6.0));   // GLSL needs the float mod() here; gives the hue sector 0..5
  float f = H / 60.0 - float(Hi);
  float p = V * (1.0 - S);
  float q = V * (1.0 - (f * S));
  float t = V * (1.0 - ((1.0 - f) * S));

  if (Hi == 0)
    return vec3(V, t, p);
  if (Hi == 1)
    return vec3(q, V, p);
  if (Hi == 2)
    return vec3(p, V, t);
  if (Hi == 3)
    return vec3(p, q, V);
  if (Hi == 4)
    return vec3(t, p, V);
  return vec3(V, p, q);               // Hi == 5
}

void main()
{
  vec4 grayscale = texture2D(grayscaleTexture, gl_TexCoord[0].st);

  //-- since the incoming grayscale r == b == g, just use r as the intensity

  float G = grayscale.r;
  float H = G * 360.0;
  vec3 rgb = convertHsvToRgb(H, 1.0, 1.0);
  
  //-- now just set the output color to your converted color (use full alpha)
  gl_FragColor = vec4(rgb, 1.0);
}

I didn’t compile or test this code, but the conversion is directly from that URL.

One thing to watch out for is how H wraps back around. H=0 is red, then it goes up through yellow, green, cyan, blue and magenta; at H=360 you are back to red. You probably want to map your range into [0…300] instead of [0…360] so your hottest color is magenta, not red. Just multiply your H by 5/6 before passing it into the conversion.

This will be a great use of your idle GPU, especially if your CPU is already busy.

Why the complex convertHsvToRgb() function? Since your incoming greyscale is already in the range [0, 1], just use the intensity as the texture coordinate into a 1D texture that holds your colour/temperature gradient.

  • No complex maths in the fragment shader (and certainly no nasty conditionals)
  • Data driven, just switch the 1d texture for an entirely different colour mapping

Your fragment shader would be something like:


uniform sampler2D greyscale;
uniform sampler1D colourMapping;

void main()
{
  float intensity = texture2D(greyscale, gl_TexCoord[0].st).r;
  vec3 temperatureColour = texture1D(colourMapping, intensity).rgb;
  gl_FragColor = vec4(temperatureColour, 1.0);
}
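On the Java side you would upload the gradient once as a 1D texture. A rough sketch (my own illustration; the hue-ramp gradient, the 256-entry size and colourMapTextureId are just placeholders, and it assumes the glTexImage1D overload that takes a java.nio.Buffer):

// Build a 256-entry colour gradient as RGBA bytes; a simple hue ramp is used here,
// but any black -> purple -> red -> yellow -> white mapping can go in its place.
ByteBuffer gradient = ByteBuffer.allocateDirect(256 * 4);
for (int i = 0; i < 256; i++) {
    int rgb = java.awt.Color.HSBtoRGB(i / 255.0f, 1.0f, 1.0f);
    gradient.put((byte) ((rgb >> 16) & 0xFF));
    gradient.put((byte) ((rgb >> 8) & 0xFF));
    gradient.put((byte) (rgb & 0xFF));
    gradient.put((byte) 0xFF);
}
gradient.rewind();

// Upload it once as a 1D texture
gl.glBindTexture(GL.GL_TEXTURE_1D, colourMapTextureId);
gl.glTexParameteri(GL.GL_TEXTURE_1D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
gl.glTexParameteri(GL.GL_TEXTURE_1D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
gl.glTexImage1D(GL.GL_TEXTURE_1D, 0, GL.GL_RGBA, 256, 0,
                GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, gradient);

Switching to an entirely different colour mapping is then just a matter of uploading different bytes; the shader stays the same.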

For personal reasons I don’t use Cg scripts, so I am not able to use the code above. Or maybe the same method can be done with plain OpenGL options? If there is a way to do that, please tell me how, and maybe give some sample code.

That isn’t CG, it’s GLSL, which can be used in OpenGL.

BTW disregard the pixelmap thing. After some further investigation it turns out this is a pretty badly supported part of OpenGL. Apparently most implementations don’t accelerate it or have buggy implementations…

And how am I supposed to use this GLSL? I am very unfamiliar with these computer graphics things. Isn’t GLSL a script that you have to load with an OpenGL function via JOGL???

http://www.lighthouse3d.com/opengl/glsl/

Sorry for this stupid question, but I just don’t get it. How am I supposed to use GLSL with JOGL? I read the Lighthouse stuff, but that is C code, which I tried to rewrite in Java, and some functions don’t seem to work. For instance… what is GLhandleARB anyway? JOGL doesn’t know this type, etc… need some help plz. All I need is a Java example of how to use this GLSL.

[Quoting "bahamut": “what is GLhandleARB anyway?”]
Just an int in Java.

GLSL is one of the advanced features of OpenGL. Try to master OpenGL first, otherwise you’ll get stuck sooner or later…

There are plenty of tutorials about GLSL, and IIRC NeHe has a lesson about it too. Porting to Java is normally 1:1; the weird GL primitives are normally just ints. Otherwise, check the javadocs to see which primitives the methods return.

If you just want a quick solution, it might be better to hire somebody; this seems to be a professional project, doesn’t it?

void setShaders() {
	v = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
	f = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);	
	
	vs = textFileRead("toon.vert");
	fs = textFileRead("toon.frag");
	
	const char * vv = vs;
	const char * ff = fs;
	
	glShaderSourceARB(v, 1, &vv,NULL);
	glShaderSourceARB(f, 1, &ff,NULL);
	
	free(vs);free(fs);
	
	glCompileShaderARB(v);
	glCompileShaderARB(f);

	p = glCreateProgramObjectARB();
		
	glAttachObjectARB(p,v);
	glAttachObjectARB(p,f);
	
	glLinkProgramARB(p);
	glUseProgramObjectARB(p);
}

translates to

public void setShaders(GL gl) {
	// Create the shader objects for the vertex and fragment shader
	int v = gl.glCreateShaderObjectARB(GL.GL_VERTEX_SHADER_ARB);
	int f = gl.glCreateShaderObjectARB(GL.GL_FRAGMENT_SHADER_ARB);	
	
	// Read the GLSL code from file
	String vs = textFileRead("toon.vert");
	String fs = textFileRead("toon.frag");

	// Pass the source to OpenGL	
	gl.glShaderSourceARB(v, 1, new String[] {vs}, null);
	gl.glShaderSourceARB(f, 1, new String[] {fs}, null);
	
	// Compile the two shaders
	gl.glCompileShaderARB(v);
	gl.glCompileShaderARB(f);

	// Create the program object
	int p = gl.glCreateProgramObjectARB();

	// Attach the two shaders to the program object		
	gl.glAttachObjectARB(p,v);
	gl.glAttachObjectARB(p,f);
	
	// Link the program
	gl.glLinkProgramARB(p);

	// And set it as the current program
	gl.glUseProgramObjectARB(p);
}

public String textFileRead(String aFileName) {
	// Read the whole file into a String (needs java.io.BufferedReader, FileReader, IOException)
	StringBuilder sb = new StringBuilder();
	try {
		BufferedReader reader = new BufferedReader(new FileReader(aFileName));
		String line;
		while ((line = reader.readLine()) != null) {
			sb.append(line).append('\n');
		}
		reader.close();
	} catch (IOException e) {
		e.printStackTrace();
	}
	return sb.toString();
}

I haven’t had the opportunity to play with shaders yet, so I’m not sure this works :)
I’m not sure how jogl handles the Strings, so you might have to pass the length to glShaderSourceARB for things to work properly.

Ok ok, and the feedback-handler
(LWJGL code but easy to port to JOGL)


...
glCompileShaderARB(f);
checkLogInfo(f, "Fragment Shader compilation: ", true);

...
glCompileShaderARB(v);
checkLogInfo(v, "Vertex Shader compilation: ", true);

...
glValidateProgramARB(p);
checkLogInfo(p, "Shader Program validation: ", false);


   private final void checkLogInfo(int obj, String tag, boolean throwException)
   {
      IntBuffer iVal = BufferUtils.createIntBuffer(1);
      glGetObjectParameterARB(obj, GL_OBJECT_INFO_LOG_LENGTH_ARB, iVal);

      int length = iVal.get();

      if (length <= 1)
      {
         return;
      }
      
      ByteBuffer infoLog = BufferUtils.createByteBuffer(length);

      iVal.flip();
      glGetInfoLogARB(obj, iVal, infoLog);

      Util.checkGLError();

      byte[] infoBytes = new byte[length];
      infoLog.get(infoBytes);
      String out = new String(infoBytes);

      String msg = tag + out.substring(0, out.length() - 1);

      if (throwException)
      {
         throw new ShaderException(msg);
      }

      System.out.println("GLSL Validation >> " + msg);
   }



   private final ByteBuffer toByteString(String str, boolean isNullTerminated)
   {
      int length = str.length();

      if (isNullTerminated)
         length++;

      ByteBuffer buff = BufferUtils.createByteBuffer(length);
      buff.put(str.getBytes());

      if (isNullTerminated)
         buff.put((byte) 0);

      buff.flip();

      return buff;
   }

Thanks all, I hope this works…

Ehh…I did this now…

int v = gl.glCreateShaderObjectARB(GL.GL_VERTEX_SHADER_ARB);
String[] vs = { loadFile("Shaders.vert") };
gl.glShaderSourceARB(v, 1, vs, null);

Eclipse says that the method glCreateShaderObjectARB is ambiguous for the type GL. Where is it going wrong this time?

Well, that is just a standard Java compiler error: the method is overloaded and the parameters you pass to it can be interpreted in more than one way.

Like:
Holder.setValue(8, null);

With:
class Holder
{
	public static void setValue(int x, List list) {}
	public static void setValue(int x, Map map) {}
}
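In your case the compiler most likely can’t tell which glShaderSourceARB overload that final null belongs to. A sketch of two ways to resolve it, assuming the overloads differ only in the type of that last length argument (check your JOGL javadocs for the exact signatures):

// Casting the null tells the compiler which overload is meant...
gl.glShaderSourceARB(v, 1, vs, (int[]) null);

// ...or pass the source length explicitly, as suggested above
gl.glShaderSourceARB(v, 1, vs, new int[] { vs[0].length() });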

Ah, yes… of course. Got it, I fixed it… Thanks…

But after this, I still get the following exception:

Exception in thread "AWT-EventQueue-0" net.java.games.jogl.GLException: Method "glCreateShaderObjectARB" not available
	at net.java.games.jogl.impl.windows.WindowsGLImpl.glCreateShaderObjectARB(WindowsGLImpl.java:5779)
	at src.SecondGLEventListener.setShaders(SecondGLEventListener.java:71)
	at src.SecondGLEventListener.display(SecondGLEventListener.java:49)
	at net.java.games.jogl.impl.GLDrawableHelper.display(GLDrawableHelper.java:74)
	at net.java.games.jogl.GLCanvas$DisplayAction.run(GLCanvas.java:249)
	at net.java.games.jogl.impl.GLContext.invokeGL(GLContext.java:294)
	at net.java.games.jogl.impl.windows.WindowsOnscreenGLContext.invokeGL(WindowsOnscreenGLContext.java:79)
	at net.java.games.jogl.GLCanvas.maybeDoSingleThreadedWorkaround(GLCanvas.java:236)
	at net.java.games.jogl.GLCanvas.display(GLCanvas.java:77)
	at net.java.games.jogl.GLCanvas.paint(GLCanvas.java:86)
	at sun.awt.RepaintArea.paintComponent(Unknown Source)
	at sun.awt.RepaintArea.paint(Unknown Source)
	at sun.awt.windows.WComponentPeer.handleEvent(Unknown Source)
	at java.awt.Component.dispatchEventImpl(Unknown Source)
	at java.awt.Component.dispatchEvent(Unknown Source)
	at java.awt.EventQueue.dispatchEvent(Unknown Source)
	at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
	at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
	at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
	at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
	at java.awt.EventDispatchThread.run(Unknown Source)

Method “glCreateShaderObjectARB” not available??? Wrong version of JOGL or something like that? By the way… I don’t use a vertex shader anyway, so I deleted that part.

Your driver or your card doesn’t support shader programs, it seems.
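You can check this at runtime before calling any of the shader functions. A small sketch, assuming the isExtensionAvailable/isFunctionAvailable helpers are present in the JOGL version you’re using:

// Only set up GLSL shaders if the driver actually exposes them
if (gl.isExtensionAvailable("GL_ARB_shader_objects")
        && gl.isFunctionAvailable("glCreateShaderObjectARB")) {
    setShaders(gl);
} else {
    System.out.println("GLSL shaders not supported by this driver/card");
}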

Funny, so there is nothing wrong with my JOGL then… Anyway, what kind of video card do I need? I have an NVIDIA GeForce FX 5200. Would that be a problem???