A bit of an OpenGL question for you - how many bits of subpixel precision do your OpenGL drivers provide?
My 8MB S3 Savage/IX seems to only use 3 bits - less than the 4 bits the OpenGL spec requires. Is this common practice? What can I do about it? And does anyone know of a way I can prove that it's only 3 bits, rather than just taking the value reported for GL_SUBPIXEL_BITS at face value? (See the query and probe sketches below.)
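For reference, here's a minimal sketch of the query itself. I'm assuming LWJGL 2's Display class for context creation (adjust for your version); the key point is that a GL context must be current before glGetInteger returns anything meaningful:

```java
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

import static org.lwjgl.opengl.GL11.*;

public class SubpixelBitsQuery {
    public static void main(String[] args) throws Exception {
        // A context must exist before any GL query; Display is LWJGL 2's
        // window/context helper (assumption - adapt to your LWJGL version).
        Display.setDisplayMode(new DisplayMode(640, 480));
        Display.create();

        // GL_SUBPIXEL_BITS is the driver's *claimed* subpixel precision;
        // the OpenGL spec says it must be at least 4.
        int subpixelBits = glGetInteger(GL_SUBPIXEL_BITS);
        System.out.println("GL_SUBPIXEL_BITS = " + subpixelBits);

        Display.destroy();
    }
}
```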
(N.B. Just so people don’t get the wrong end of the stick, this is not LWJGL’s fault! It can only provide what the drivers allow.)
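As for proving it empirically rather than trusting the query: with n bits of subpixel precision, vertex window coordinates snap to a 1/2^n-pixel grid. So if you draw a nearly horizontal line and sweep its start y in 1/32-pixel steps, the column where the line steps up one row should land in only 2^n distinct places over a full pixel of sweep. Here's a rough sketch under the same LWJGL 2 assumption - the window size, sweep granularity, and probed row are arbitrary choices of mine, not anything canonical:

```java
import java.nio.ByteBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

import static org.lwjgl.opengl.GL11.*;

public class SubpixelProbe {
    static final int W = 256, H = 64;

    public static void main(String[] args) throws Exception {
        Display.setDisplayMode(new DisplayMode(W, H));
        Display.create();

        // Pixel-aligned orthographic projection so vertex coordinates
        // map directly to window pixels.
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, W, 0, H, -1, 1);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();

        // Sweep the line's start y in 1/32-pixel steps. With n subpixel
        // bits the driver quantises y to multiples of 1/2^n, so only 2^n
        // distinct step columns should show up across the 32 iterations.
        for (int i = 0; i < 32; i++) {
            float y0 = 10.0f + i / 32.0f;

            glClear(GL_COLOR_BUFFER_BIT);
            glBegin(GL_LINES);
            glVertex2f(0.0f, y0);        // nearly horizontal line rising
            glVertex2f(W, y0 + 1.0f);    // exactly one pixel over W pixels
            glEnd();
            glFinish();

            // Read back row 11 and find the first lit column, i.e. where
            // the line has stepped up into that row. The step column should
            // move by W/2^n pixels per representable change in y0.
            ByteBuffer row = BufferUtils.createByteBuffer(W * 4);
            glReadPixels(0, 11, W, 1, GL_RGBA, GL_UNSIGNED_BYTE, row);
            int step = -1;
            for (int x = 0; x < W; x++) {
                if ((row.get(x * 4) & 0xFF) > 0) { step = x; break; }
            }
            System.out.println("y0 = 10 + " + i + "/32 -> step column " + step);
        }

        Display.destroy();
    }
}
```

If only 8 distinct step columns show up over the full sweep, that's fairly strong evidence of 3-bit precision; 16 distinct columns would indicate the 4 bits the spec mandates. (The exact step position depends on the line rasterization rules, but the granularity argument holds either way.)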
