I’m having an issue with non-power-of-two textures on ATI cards that I’m hoping someone can help me find a software fix for.
Using an image that is 21x20 pixels, getImageTexCoords() is returning [0.0, 0.0, 21.0, 20.0]. This is with an updated ATI driver (from 12/4/2007, version 8.442.0.0), and has been replicated on two different machines, one with an X1400 and one with a Radeon XPress. Before the driver update, the image displayed correctly and these values were returned correctly, bounded between 0 and 1 (and they are still returned correctly on all NVidia cards).
In addition, the Texture object is returning 21 and 20 from calls to getWidth() and getHeight(), respectively. If these calls returned the actual allocated texture size, I could just compute the texture coordinates myself; but since I don't have access to the real created texture size, I can't do this reliably across platforms.
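To show what I mean, here is a minimal sketch of the workaround I would use if the real texture size were known. It assumes the driver pads a non-power-of-two image up to the next power of two, which is only a guess on my part (I can't verify what this ATI driver actually allocates), and the helper names are my own, not JOGL API:

```java
public class TexCoordSketch {
    // Smallest power of two >= n (for n >= 1).
    static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }

    // Normalized texture coordinates [left, top, right, bottom] for an
    // image of imgW x imgH pixels stored in a texW x texH texture.
    static float[] normalizedCoords(int imgW, int imgH, int texW, int texH) {
        return new float[] { 0.0f, 0.0f, (float) imgW / texW, (float) imgH / texH };
    }

    public static void main(String[] args) {
        int imgW = 21, imgH = 20;            // the image from this report
        int texW = nextPowerOfTwo(imgW);     // 32, under the padding assumption
        int texH = nextPowerOfTwo(imgH);     // 32
        float[] tc = normalizedCoords(imgW, imgH, texW, texH);
        System.out.println(texW + "x" + texH + " -> [" + tc[0] + ", "
                + tc[1] + ", " + tc[2] + ", " + tc[3] + "]");
        // prints: 32x32 -> [0.0, 0.0, 0.65625, 0.625]
    }
}
```

But without knowing whether the driver really pads to a power of two (or uses a rectangle texture with unnormalized coordinates instead), I can't compute texW/texH correctly everywhere.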
As one last piece of data: if I hack in the correct texture coordinates, the wrong underlying image data seems to get used for this texture, and a different image is actually displayed. The texture's stored target is different in that case, though, and writing the source image out to disk does produce the correct image, so I don't know how that could be happening.
We don’t have the option of asking our users to install a different driver, so I need a fix that I can deploy with our applet.
We’re using JOGL 1.1.1 rc6.
Many thanks,
Dale Beermann