Problem with User Defined Clipping Planes on ATI cards

Hi all!

I am using four user-defined clipping planes in orthographic projection mode to “cut out” a rectangle. I programmed and tested that on a GeForce card and everything was just fine. Now I got an ATI card (ATI Mobility Radeon X300), and sometimes my user-defined clipping planes are off by one pixel! The position of a clip plane is apparently not computed precisely (unless I am making a mistake). Interestingly, the problem appears differently at different positions in my orthographic 2D space. You may say “only one pixel, so what!?”, but it does matter in my case. I attached a screenshot to illustrate the problem.
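Roughly, the clip-plane setup looks like this (a simplified sketch, not my exact code; left, right, bottom and top stand for the rectangle edges in my ortho coordinates):

GLdouble leftPlane[4]   = {  1.0,  0.0, 0.0, -left   };  /* keep x >= left   */
GLdouble rightPlane[4]  = { -1.0,  0.0, 0.0,  right  };  /* keep x <= right  */
GLdouble bottomPlane[4] = {  0.0,  1.0, 0.0, -bottom };  /* keep y >= bottom */
GLdouble topPlane[4]    = {  0.0, -1.0, 0.0,  top    };  /* keep y <= top    */

glClipPlane(GL_CLIP_PLANE0, leftPlane);
glClipPlane(GL_CLIP_PLANE1, rightPlane);
glClipPlane(GL_CLIP_PLANE2, bottomPlane);
glClipPlane(GL_CLIP_PLANE3, topPlane);

glEnable(GL_CLIP_PLANE0);
glEnable(GL_CLIP_PLANE1);
glEnable(GL_CLIP_PLANE2);
glEnable(GL_CLIP_PLANE3);

/* draw the widget contents, then disable GL_CLIP_PLANE0..3 again */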

I tested it on several other machines with various GeForce models, and the problem does not occur there. However, on all the ATI machines I tried, the problem appears…

Any ideas? Any hint is appreciated! Although I haven’t described the problem in detail, perhaps one of you has encountered the same problem and can point out my faux pas.

Thanks

Johannes

PS: I will try next to reproduce the problem in an isolated mini program…

For pixel-perfect rendering, you should follow the advice here. Specifically,

[quote]If exact two-dimensional rasterization is desired, you must carefully specify both the orthographic projection and the vertices of primitives that are to be rasterized. The orthographic projection should be specified with integer coordinates, as shown in the following example:

gluOrtho2D(0, width, 0, height);

where width and height are the dimensions of the viewport. Given this projection matrix, polygon vertices and pixel image positions should be placed at integer coordinates to rasterize predictably. For example, glRecti(0, 0, 1, 1) reliably fills the lower left pixel of the viewport, and glRasterPos2i(0, 0) reliably positions an unzoomed image at the lower left of the viewport. Point vertices, line vertices, and bitmap positions should be placed at half-integer locations, however. For example, a line drawn from (x1, 0.5) to (x2, 0.5) will be reliably rendered along the bottom row of pixels in the viewport, and a point drawn at (0.5, 0.5) will reliably fill the same pixel as glRecti(0, 0, 1, 1).

An optimum compromise that allows all primitives to be specified at integer positions, while still ensuring predictable rasterization, is to translate x and y by 0.375, as shown in the following code fragment. Such a translation keeps polygon and pixel image edges safely away from the centers of pixels, while moving line vertices close enough to the pixel centers.

glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0, width, 0, height);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.375, 0.375, 0.0);
/* render all primitives at integer positions */
[/quote]

You’re using clipping planes for rectangular cutouts?
Use the scissor test instead! It is much more reliable (it takes actual integer pixel coordinates) and faster (it is actually hardware accelerated on all GeForce cards, unlike user clip planes).

glScissor(x, y, width, height);

Of course, as usual, y=0 is the lower edge of the screen in OpenGL.
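In code, the scissor variant boils down to something like this (just a sketch; if your widget coordinates come from a toolkit with a top-left origin, flip y first):

glEnable(GL_SCISSOR_TEST);
/* if y is measured from the top, convert first: y = windowHeight - (y + height) */
glScissor(x, y, width, height);
/* ... draw the clipped content ... */
glDisable(GL_SCISSOR_TEST);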

ahh! Thanks!

Indeed, I used glOrtho() instead of gluOrtho2D(). I guess I haven’t read the Red Book up to this appendix :-[

[quote]You’re using clipping planes for rectangular cutouts?
[/quote]
Yeah, in fact we used glScissor a long time ago. Then I thought using clip planes would be a smart idea, because it lets us clip widgets even when they are not drawn in plain 2D orthographic mode. But since that will probably never be the case, I think we will fall back to the scissor variant for clipping…

Thanks again for your help! I’ll keep you posted.

Johannes

I switched back to glScissor and now it works as expected! But from what I can tell, the performance gain is only marginal (actually not measurable), probably because my running examples are rather small.

Thanks again!

Johannes