Radeon 9550

I have a test machine with a Radeon 9550 (AGP) running Windows XP Home Basic. The game I am developing is running in full-screen exclusive mode.

I am using a single, very large VolatileImage for the game’s main map. I noticed that on this one test machine I get garbage drawn in the image when the VolatileImage is very large (about 3000 x 3000). The garbage appears along the outer portions of each side. This seems to be a problem with the VolatileImage’s large dimensions.

I manually broke the image down into an array of VolatileImage objects that are each 512 x 512. This solved the problem. But my question is: should this be happening at all? When I use a smaller map (about 2400 x 2400) this doesn’t happen.
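A minimal sketch of that kind of tiling, just to illustrate the arithmetic (the `TileGrid` class and its names are mine, not from the post): it splits a large map into 512 x 512 cells, with smaller partial cells on the right and bottom edges. In the actual game, each cell would back its own VolatileImage.

```java
// Sketch of the tiling workaround: split one huge map image into a grid of
// small cells, each of which would back its own VolatileImage in the game.
// TileGrid is illustrative, not code from the original post.
public class TileGrid {
    final int tileSize;
    final int cols, rows;
    final int mapWidth, mapHeight;

    TileGrid(int mapWidth, int mapHeight, int tileSize) {
        this.mapWidth = mapWidth;
        this.mapHeight = mapHeight;
        this.tileSize = tileSize;
        // Round up so the right/bottom edges get (smaller) partial tiles.
        this.cols = (mapWidth + tileSize - 1) / tileSize;
        this.rows = (mapHeight + tileSize - 1) / tileSize;
    }

    // Width of the tile in column c (the last column may be narrower).
    int tileWidth(int c) {
        return Math.min(tileSize, mapWidth - c * tileSize);
    }

    // Height of the tile in row r (the last row may be shorter).
    int tileHeight(int r) {
        return Math.min(tileSize, mapHeight - r * tileSize);
    }

    public static void main(String[] args) {
        TileGrid g = new TileGrid(3000, 3000, 512);
        // 3000 = 5 * 512 + 440, so a 6x6 grid with 440-px edge tiles.
        System.out.println(g.cols + "x" + g.rows);
        System.out.println(g.tileWidth(0) + " " + g.tileWidth(g.cols - 1));
    }
}
```

For a 3000 x 3000 map this gives a 6 x 6 grid where the edge tiles are 440 px, since 3000 = 5 * 512 + 440.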

I haven’t noticed any appreciable difference in performance with the approach of using my own subimages. So I guess that it’s a good workaround if nothing else. But I am curious: is this behavior expected? Or is it another driver problem? I am using the most recent Catalyst 10.2 driver. It so happens that the motherboard supports AGP 4x but not 8x.

And I note that the driver for this model is now on legacy status at ATI.

Thanks

James

I would imagine that the 9550’s maximum framebuffer size is smaller than 3000x3000. I can dimly remember similar problems on the 9600 - when extending the desktop onto a second screen, OpenGL windows would just stop rendering when placed at the far right edge.

The maximum texture size is 2048 * 2048 on this graphics card as far as I know. It is not a driver bug, only a limitation of the hardware. If you used OpenGL rather than plain Java2D, you could even query this value (GL_MAX_TEXTURE_SIZE). 512 * 512 will work on many graphics cards.
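If one wanted to pick a tile size defensively rather than hard-coding 512, a simple rule (my suggestion, not something from the thread) is to clamp to the largest power of two at or below the reported maximum, with a conservative cap in case the driver over-reports. Older cards often accept only power-of-two texture dimensions, which is part of why 512 x 512 is such a safe choice.

```java
// Defensive tile-size choice: take the largest power of two that is
// <= the reported maximum texture size, capped at a conservative
// ceiling since drivers sometimes over-report their limits.
// This is a suggestion, not code from the thread.
public class TileSizeChooser {
    static int safeTileSize(int reportedMax, int cap) {
        int limit = Math.min(reportedMax, cap);
        int size = 1;
        while (size * 2 <= limit) {
            size *= 2; // climb through powers of two: 1, 2, 4, ...
        }
        return size;
    }

    public static void main(String[] args) {
        System.out.println(safeTileSize(2048, 1024)); // capped at 1024
        System.out.println(safeTileSize(8192, 1024)); // capped at 1024
        System.out.println(safeTileSize(600, 1024));  // rounded down to 512
    }
}
```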

I have an AGP 8x card too, a Radeon HD 3650. I looked up its texture size limit: “GL_MAX_TEXTURE_SIZE: 8192”. It can even play Crysis well on low settings ;D

I’d consider this a bug in Java; the graphics API provides no means of knowing these limitations, as it’s meant to be higher level than that.
Consequently, such driver limitations should be handled transparently by the Java2D abstraction.
If the driver is misreporting the maximum texture size, it becomes understandable why Java2D appears to be bugged - but it is still the responsibility of the Java2D abstraction to account for such issues.

Trouble is - how’s Java supposed to know when the driver lies?
Answer - it’s not - get newer video drivers!

Cas :slight_smile:

First off, thanks for all the responses. :slight_smile:

In case my original post was not clear, I am using the D3D pipeline.

I thought that BufferedImage was implemented as a texture (when it’s accelerated), but not VolatileImage? Because of render-to-texture issues?

I agree with Abuse, but I think that Cas has given the most practical solution. The problem is that the Catalyst 10.2 driver is the last standard-release driver for this “legacy” card. I don’t suppose that ATI will issue a Hotfix for me. Maybe I should try? :wink:

Well, I have a workaround, and perhaps it’s best to use it all the time to be safe.

James

Edit: I meant that Abuse is right in the sense that it would be nice if Java took care of problems like this. But Cas is right to note that Java is dependent on accurate information from the driver.