Really stupid problem

I have displayed an RGB histogram of my byte image.
I’m trying to do linear stretching, taking the input max and min values from the histogram as the user requests, with output min and max as 0 and 255. Then I’m loading a color table… the default color table works fine… I’m using a ByteBuffer for the color tables…
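
For reference, a linear stretch lookup table is usually built along these lines (a minimal sketch with hypothetical names like inMin/inMax, not the actual code from this thread):

// Map [inMin, inMax] linearly onto [0, 255], clamping values outside the range
byte[] lut = new byte[256];
for (int i = 0; i < 256; i++) {
    int stretched = (i - inMin) * 255 / (inMax - inMin);
    stretched = Math.max(0, Math.min(255, stretched));
    lut[i] = (byte) stretched; // the stored bits are correct even though Java prints them signed
}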

However, I’m not getting the desired output.
When I print the color table it shows negative values, since Java interprets a byte as -128 to 127.
Moreover, only changes in the blue histogram are reflected.
It seems that instead of enhancement the image is deteriorating…

What can be wrong… is it something to do with the byte stuff, or do I need to check my logic again?

It’s the byte stuff. I had the same sort of problem. I ended up using chars to hold the data for images that were 16bpp and longs for 32bpp, then casting them to ints to put the data back into the buffer so that Java wouldn’t misinterpret the signed vs. unsigned values.
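
For instance, char is Java’s only unsigned 16-bit type, so it can hold 16bpp samples without sign trouble (a minimal sketch; the variable names are illustrative):

short raw = (short) 0xF0A0; // prints as -3936, because short is signed
char sample = (char) raw;   // same bits reinterpreted as unsigned: 0xF0A0 = 61600
int value = sample;         // widening a char to int never sign-extends
System.out.println(raw + " vs " + value); // -3936 vs 61600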

Hey,

but I don’t understand…
My table is a ByteBuffer… and when I load a normal table (values 0-255) it works well…

Whatever the value in the byte buffer, when I’m using GL_UNSIGNED_BYTE isn’t JOGL supposed to interpret it as unsigned?

When I invert only the red or green values nothing happens… but something happens when I invert the blue ones…

It’s hard to tell what the problem is without seeing the code with a description: what each portion of the code is expected to do, and how the real result differs from the expectation. Try to narrow it down to a code example with comments and post it here.

Don’t know about the red and green stuff, but I think your problem might be trying to modify an unsigned byte in Java, which is not possible, since Java has no knowledge of an unsigned byte datatype. If you want to modify an unsigned byte in Java you have to convert it to an int first:


int unsignedValue = ((int) byteValue) & 255;

Note the “& 255”, which is necessary to mask off the sign-extended two’s complement bits from the resulting integer and preserve the real unsigned value of the byte. To convert it back, a simple cast is sufficient:


byte byteValue = (byte) unsignedValue;

You will notice that byteValue looks signed if you print it out, but that is irrelevant, since OpenGL will interpret the binary content as an unsigned byte.
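
A quick illustration of that behavior (a sketch, just to show the printing difference):

byte byteValue = (byte) 200;         // stored bits: 0xC8
System.out.println(byteValue);       // prints -56: Java shows the value as signed
System.out.println(byteValue & 255); // prints 200: the value OpenGL will see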

This will work just as well, and it’s one (manual) cast less:

int unsignedValue = byteValue & 255;

Since 255 is an int literal, byteValue is promoted to int for the & operation, so the result is an int as well.

It seems I have sorted it out… Actually I had created a class called LUT in which new lookup tables were created.
I was then creating 3 objects of this class for R, G, B and then combining them into one RGB lookup table…
So only whichever object was created last showed any changes…

So I changed the class to create a single LUT, and then it worked…

Why is that so? In the end I was loading the same LUT anyway.
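
One plausible explanation, purely as an assumption since the LUT class isn’t shown here: if the class kept its table in a static (shared) field, every new instance would write into the same array, so only the last-created object’s values would survive:

// Hypothetical reconstruction of the bug, assuming the table was static
class LUT {
    static byte[] table = new byte[256]; // shared by ALL instances

    LUT(byte fillValue) {
        for (int i = 0; i < 256; i++) {
            table[i] = fillValue; // each new LUT overwrites the same array
        }
    }
}
// After: new LUT((byte) 1); new LUT((byte) 2); new LUT((byte) 3);
// LUT.table holds only the last object's values, so only one
// channel's changes would ever show up.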