glTexImage2D problem (SOLVED)

I’ve come across a problem when trying to use the low-level texture calls instead of the com.sun.opengl.util.texture.Texture class. I’m using a floating-point texture (which is sent to a shader to apply a color based on its value, but that’s not the important part), and I have a float[] called “data” which I’ve filled with pertinent data. If I do this:


TextureData texData = new TextureData(GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_FLOAT, false, false, true, FloatBuffer.wrap(data), null);
tex[MYTEXTURE] = TextureIO.newTexture(texData).getTextureObject();

(where “tex” is a previously allocated int[], and width and height are ints such that data.length == width*height) and then I use “tex[MYTEXTURE]” to bind at rendering, it works perfectly. However, if I replace the above with this:


gl.glGenTextures(1, tex, MYTEXTURE);
gl.glBindTexture(GL_TEXTURE_2D, tex[MYTEXTURE]);
gl.glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_FLOAT, FloatBuffer.wrap(data));

then it seems that the texture data wasn’t uploaded, and it looks as if all the values are 0.
What could I be missing here?

Edit: I’m using hardware (ATI MR 4570) and software capable of non-power-of-two textures, and width and height are non-power-of-two.

Edit: Solved it. I forgot to set the filters; adding the following fixes the second snippet. (The default minification filter, GL_NEAREST_MIPMAP_LINEAR, requires a full mipmap chain, so without one the texture is incomplete and samples as black — the Texture class path works because TextureIO sets the filters for you.)


gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

(didn’t need linear filtering since the texture map is pixel perfect)
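For anyone hitting the same thing, here’s the full low-level sequence in one place — a sketch assuming the same setup as above (a current GL context in `gl`, statically imported GL constants, and `tex`, `MYTEXTURE`, `width`, `height`, `data` as already described):

```java
// Assumes: gl is a current GL context; tex is an allocated int[];
// data.length == width * height.
gl.glGenTextures(1, tex, MYTEXTURE);
gl.glBindTexture(GL_TEXTURE_2D, tex[MYTEXTURE]);

// Set non-mipmap filters so the single-level texture is complete.
// Without this, the default min filter (GL_NEAREST_MIPMAP_LINEAR)
// expects mipmaps that were never uploaded, and sampling returns 0.
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// Optional but harmless for GL_FLOAT data: rows of 4-byte floats are
// always tightly packed, but an explicit unpack alignment of 1 avoids
// surprises if the format is ever changed to a byte type with odd widths.
gl.glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

gl.glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                GL_LUMINANCE, GL_FLOAT, FloatBuffer.wrap(data));
```

The filter calls only need to happen once per texture object (they are texture state, not global state), so they can go either before or after the glTexImage2D upload.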