Fragment programs and gl{Draw,Copy}Pixels

Hi,

I am trying to use a fragment program with the glDrawPixels and glCopyPixels calls, in JOGL. I have no problem getting my fragment programs to be invoked when rendering polygons, but they don’t seem to run when I use glDrawPixels or glCopyPixels. (Really, I’m only interested in the latter, in order to feed color data from the framebuffer back into my fragment program.)

I am wondering which of the following is the case:

  1. My interpretation of the GL spec (that DrawPixels and CopyPixels feed their fragments through the fragment program) is incorrect, and I shouldn’t expect this to happen.

  2. The GL drivers on my machine are broken somehow.

  3. This is a JOGL problem, and has to do with some fancy internal management of the framebuffer, or some such.

If the answer is 1 or 2, I apologize in advance for the off topic post.

Below is an excerpt of the code I am using. I am using the DebugGL debugging wrapper, and no exceptions are being raised during the execution of my program.

TIA,
Kevin

– code excerpt –

gl.glClear(GL.GL_COLOR_BUFFER_BIT);

String prog = … long fragment program …

gl.glProgramStringARB(
    GL.GL_FRAGMENT_PROGRAM_ARB,
    GL.GL_PROGRAM_FORMAT_ASCII_ARB,
    prog.length(),
    prog
);

gl.glEnable(GL.GL_FRAGMENT_PROGRAM_ARB);

// generate some random pixels
Random rand = new Random();
ByteBuffer data = ByteBuffer.allocate(64*64);
for (int i = 0; i < 64; ++i) {
    for (int j = 0; j < 64; ++j) {
        data.put((byte)rand.nextInt());
    }
}
data.flip();

gl.glRasterPos2i(500, 200);

// here, pixels are in fact copied out of my ByteBuffer,
// but without passing through the fragment program
gl.glDrawPixels(64, 64, GL.GL_LUMINANCE, GL.GL_UNSIGNED_BYTE, data);

// and here, pixels are in fact being read from the frame buffer,
// but without passing through the fragment program
gl.glCopyPixels(500, 500, 100, 100, GL.GL_COLOR);

// here, the fragments of the rectangle do pass through the fragment program
// (as can be verified by modifying the program to set the fragment color to red, or whatever)
gl.glRecti(100, 100, 200, 200);

– info strings for my crappy laptop’s onboard GPU –
– generated during execution of the above program –

INIT GL IS: javax.media.opengl.DebugGL
GL_VENDOR: Intel
GL_RENDERER: Intel 915GM
GL_VERSION: 1.4.0 - Build 4.14.10.3984

glDrawPixels and glCopyPixels do not generate fragments. I think the documentation in the OpenGL Red Book is somewhat misleading, which may be the source of your confusion. The detailed descriptions of glDrawPixels and glReadPixels later in the chapter make it clearer that fragment processing does not occur for these rectangular pixel copies.

Ah, thanks, Ken. I suppose I should stop being a cheap bastard and buy the Red Book rather than trolling through SGI’s web site. I was misled by the comment

(17) Should fragment programs affect all fragments, or just those
produced by the rasterization of points, lines, and triangles?

  RESOLVED: Every fragment generated by the GL is subject to 
  fragment program mode.  This includes point, line, and polygon 
  primitives as well as pixel rectangles and bitmaps.

in the fragment program extension spec at

http://oss.sgi.com/projects/ogl-sample/registry/ARB/fragment_program.txt

So far, aside from my GL incompetence, my experience with JOGL has been pain-free, by the way. I’m so happy I can do this in Java instead of C.

Kevin

You might want to bring this up on the forums at opengl.org. This is probably an area where the spec is incorrect, at least to the best of my knowledge. It’s also possible that Intel’s drivers are simply wrong, so you should probably test at least on NVidia hardware first.

From what I can tell, glDrawPixels should generate fragments. The man page for glDrawPixels, for instance, says:

[quote]The GL then converts the resulting RGBA colors to fragments by attaching the current raster position z coordinate and texture coordinates to each pixel.

These pixel fragments are then treated just like the fragments generated by rasterizing points, lines, or polygons. Texture mapping, fog, and all the fragment operations are applied before the fragments are written to the frame buffer.
[/quote]
And from looking at the OpenGL state machine diagram (from OpenGL 1.1, though), it looks as if rasterization fragments and DrawPixels fragments should follow the same path.

Also, reading through the OpenGL 2.0 spec seems to indicate that fragments should be generated.

I asked this question on the “beginners” forum at opengl.org:

http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=2;t=019680

I am leaning toward the view that this is an issue with my hardware/drivers, but unfortunately I don’t have alternative hardware to play with at the moment. In any case, I am reassured that this is not a problem with JOGL.

I suppose if broken implementations like this Intel one are common, defensive programming would require testing each pipeline in my software by rendering something and reading back the pixels to see whether the correct thing happened. What a pain.
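Something along these lines is what I have in mind. It’s only a rough sketch, not tested code: load a trivial program that paints every fragment red, draw a single white pixel with glDrawPixels, and read it back with glReadPixels. The coordinates assume the same kind of orthographic setup as my excerpt above, and the helper name is made up.

// Hypothetical helper: returns true if glDrawPixels fragments went through the
// fragment program. Assumes a projection that maps GL units 1:1 to window pixels,
// as in the excerpt above.
private boolean drawPixelsUsesFragmentProgram(GL gl) {
    // Trivial ARB fragment program that paints every fragment solid red.
    String prog =
        "!!ARBfp1.0\n" +
        "MOV result.color, {1.0, 0.0, 0.0, 1.0};\n" +
        "END\n";
    gl.glProgramStringARB(
        GL.GL_FRAGMENT_PROGRAM_ARB,
        GL.GL_PROGRAM_FORMAT_ASCII_ARB,
        prog.length(),
        prog
    );
    gl.glEnable(GL.GL_FRAGMENT_PROGRAM_ARB);

    // Draw a single white pixel; if the fragment program runs, it comes out red.
    ByteBuffer white = ByteBuffer.allocateDirect(4);
    white.put((byte) 255).put((byte) 255).put((byte) 255).put((byte) 255);
    white.flip();
    gl.glRasterPos2i(10, 10);
    gl.glDrawPixels(1, 1, GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, white);

    // Read that pixel back and see which color actually landed in the framebuffer.
    ByteBuffer result = ByteBuffer.allocateDirect(4);
    gl.glReadPixels(10, 10, 1, 1, GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, result);
    gl.glDisable(GL.GL_FRAGMENT_PROGRAM_ARB);

    int r = result.get(0) & 0xFF;
    int g = result.get(1) & 0xFF;
    int b = result.get(2) & 0xFF;
    return r > 200 && g < 50 && b < 50; // red means DrawPixels went through the program
}

If it really is a driver problem, this should return true on better hardware and false on this Intel chip, where the white pixel presumably comes back unchanged.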

On most consumer chipsets, exotic and less commonly used features are often neglected, implemented incorrectly, or implemented in software. From what I’ve read, consumer-level hardware is really optimized for rendering textured polygons, so I would agree with relic’s post at opengl.org and go for the texture-mapped quad approach.
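For what it’s worth, the texture-mapped quad route in JOGL would look roughly like the following. This is only a sketch with made-up texture dimensions and the same coordinate assumptions as the excerpt above; it also assumes your fragment program is already loaded and enabled, and that it fetches the copied colors with a TEX instruction on texture unit 0. The point is that the quad’s fragments come from ordinary polygon rasterization, so the program is guaranteed to run over them.

// Create a texture to receive the framebuffer region (sizes here are illustrative).
int[] tex = new int[1];
gl.glGenTextures(1, tex, 0);
gl.glBindTexture(GL.GL_TEXTURE_2D, tex[0]);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_NEAREST);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_NEAREST);

// Allocate a 128x128 RGBA texture (power-of-two for older chips like the 915GM),
// then pull the 100x100 framebuffer region at (500, 500) into its lower-left corner.
gl.glTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGBA, 128, 128, 0,
                GL.GL_RGBA, GL.GL_UNSIGNED_BYTE,
                ByteBuffer.allocateDirect(128 * 128 * 4));
gl.glCopyTexSubImage2D(GL.GL_TEXTURE_2D, 0, 0, 0, 500, 500, 100, 100);

// Draw a quad over the destination rectangle. With the fragment program enabled,
// the fixed-function texture enables are ignored; the program samples the bound
// texture itself. Coordinates assume the same projection as the excerpt above.
float s = 100f / 128f; // fraction of the texture that actually holds data
float t = 100f / 128f;
gl.glBegin(GL.GL_QUADS);
gl.glTexCoord2f(0, 0); gl.glVertex2i(500, 200);
gl.glTexCoord2f(s, 0); gl.glVertex2i(600, 200);
gl.glTexCoord2f(s, t); gl.glVertex2i(600, 300);
gl.glTexCoord2f(0, t); gl.glVertex2i(500, 300);
gl.glEnd();

Since the glCopyTexSubImage2D copy should stay on the card, this ought to be no more expensive than glCopyPixels on this class of hardware.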