Bits of Subpixel Precision?

A bit of an OpenGL question for you - how many bits of subpixel precision do your OpenGL drivers provide?

My 8MB S3 Savage/IX seems to use only 3 bits, less than the 4 bits the OpenGL spec requires. Is this common practice? What can I do about it? And does anyone know a way I can prove that it's really only 3 bits, rather than just accepting the queried state as fact?
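For reference, here's the query in question, a minimal sketch using LWJGL's GL11 class (the single-value glGetInteger convenience method is assumed to be available in your LWJGL version):

```java
import org.lwjgl.opengl.GL11;

// With a GL context current, ask the driver how many bits of
// sub-pixel precision it claims to rasterize with. The spec's
// state tables give 4 as the minimum value for SUBPIXEL_BITS.
int subpixelBits = GL11.glGetInteger(GL11.GL_SUBPIXEL_BITS);
System.out.println("GL_SUBPIXEL_BITS = " + subpixelBits);
```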

(N.B. Just so people don’t get the wrong end of the stick, this is not LWJGL’s fault! It can only provide what the drivers allow.)
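As for proving it empirically rather than trusting the query: one idea, just a sketch with numbers I've picked myself, is to render the same thin, sloped sliver at many tiny sub-pixel offsets and count how many distinct rasterizations come out. With n bits of sub-pixel precision the vertices snap to a 1/2^n grid, so sweeping through one pixel should produce at most about 2^n distinct images. Window setup here uses LWJGL's Display class; adapt it to whatever your version provides.

```java
import java.nio.ByteBuffer;
import java.util.HashSet;
import java.util.Set;

import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public class SubpixelProbe {
    static final int W = 64, H = 64;
    static final int STEPS = 256; // sweep one pixel in 1/256 increments

    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(W, H));
        Display.create();

        GL11.glViewport(0, 0, W, H);
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GL11.glOrtho(0, W, 0, H, -1, 1); // 1 unit == 1 pixel
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
        GL11.glLoadIdentity();
        GL11.glDisable(GL11.GL_DITHER); // dithering would confuse the readback

        ByteBuffer pixels = BufferUtils.createByteBuffer(W * H * 3);
        Set<String> distinct = new HashSet<String>();

        for (int k = 0; k < STEPS; k++) {
            float dx = k / (float) STEPS;

            GL11.glClearColor(0, 0, 0, 1);
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);

            // A tall, thin, sloped sliver: each scanline's edge crossing
            // sits at a different fraction, so sub-pixel snapping of the
            // vertices changes which pixel centres fall inside.
            GL11.glBegin(GL11.GL_TRIANGLES);
            GL11.glVertex2f(10.0f + dx, 2.0f);
            GL11.glVertex2f(12.0f + dx, 2.0f);
            GL11.glVertex2f(14.5f + dx, 62.0f);
            GL11.glEnd();

            GL11.glReadPixels(0, 0, W, H, GL11.GL_RGB,
                    GL11.GL_UNSIGNED_BYTE, pixels);

            // Fingerprint the image so distinct rasterizations can be counted
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < pixels.capacity(); i += 3) {
                sb.append(pixels.get(i) == 0 ? '0' : '1');
            }
            distinct.add(sb.toString());
        }

        System.out.println("Distinct rasterizations over a 1-pixel sweep: "
                + distinct.size());
        System.out.println("Suggests roughly "
                + Math.round(Math.log(distinct.size()) / Math.log(2))
                + " sub-pixel bits");

        Display.destroy();
    }
}
```

If the count comes out near 8 rather than 16, that would back up the 3-bit reading independently of what the driver reports.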

Best asked at the OpenGL forums at www.opengl.org.

Cas :)

True, I’ve posted the question there as well.

What are your thoughts on the matter? Don't you expect your OpenGL drivers to provide at least the minimum the spec requires? The spec's state tables list 4 as the minimum for SUBPIXEL_BITS, so falling short seems a bit odd to me. Maybe I've got a different definition of "required" than the ARB! ;D

I have no idea about subpixel precision whatsoever to be honest, as I’ve never come across any issues with it. Everything just lines up perfectly.

Cas :)