I’ve got another set of questions that I hope you can help answer; this one is about BufferedImages in particular:
-
How are the pixel samples organized in the DataBuffer retrieved from a BufferedImage created using the constructor BufferedImage(int width, int height, int type)? For example, how are the pixels laid out for a BufferedImage created with TYPE_INT_ARGB? If I wanted to access the pixel at the 5th row and 10th column in a rectangle of width x height pixels, is it simply:
pixelAt[4 * width + 9]?
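To make this first question concrete, here's a minimal sketch of the access pattern I have in mind. It assumes (which is really what I'm asking) that TYPE_INT_ARGB is backed by a single-bank DataBufferInt with one packed 0xAARRGGBB int per pixel in row-major order:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;

public class PixelAccess {
    public static void main(String[] args) {
        int width = 20, height = 10;
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);

        // Assumed layout: one packed ARGB int per pixel, rows stored
        // consecutively with no padding (scanline stride == width).
        int[] pixels = ((DataBufferInt) img.getRaster().getDataBuffer()).getData();

        // Pixel at the 5th row (index 4), 10th column (index 9):
        int argb = pixels[4 * width + 9];

        // A freshly created image is all zeroes, so alpha should be 0.
        System.out.println((argb >>> 24) & 0xFF);
    }
}
```
-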
Are there documents that describe which implementations of ColorModel, SampleModel and DataBuffer are automatically created for a BufferedImage when a particular type (e.g., BufferedImage.TYPE_INT_ARGB) is passed to the constructor?
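I haven't found such a document, so in the meantime I've been probing the concrete classes at runtime. The names printed below are just what my JRE happens to create, not anything I've seen guaranteed in writing:

```java
import java.awt.image.BufferedImage;

public class Introspect {
    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(16, 16, BufferedImage.TYPE_INT_ARGB);

        // Print the concrete classes the constructor chose for this type.
        System.out.println(img.getColorModel().getClass().getName());
        System.out.println(img.getSampleModel().getClass().getName());
        System.out.println(img.getRaster().getDataBuffer().getClass().getName());
    }
}
```

On my setup this reports DirectColorModel, SinglePixelPackedSampleModel and DataBufferInt, but I'd rather have documentation than observation.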
-
If I want to create a BufferedImage with only BITMASK transparency using the constructor BufferedImage(int width, int height, int type), what should I pass as the type? If I have to use TYPE_INT_ARGB to simulate BITMASK transparency, would it be treated as a full-alpha image (like images created with Transparency.TRANSLUCENT) when it comes to rendering performance?
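Part of why I ask: as far as I can tell, none of the predefined int types report BITMASK from getTransparency(), so there seems to be nothing suitable to pass. A quick check (again just observation on my JRE):

```java
import java.awt.Transparency;
import java.awt.image.BufferedImage;

public class TransparencyCheck {
    public static void main(String[] args) {
        // TYPE_INT_ARGB reports full alpha, not BITMASK:
        BufferedImage argb = new BufferedImage(64, 64, BufferedImage.TYPE_INT_ARGB);
        System.out.println(argb.getTransparency() == Transparency.TRANSLUCENT);

        // TYPE_INT_RGB has no alpha channel at all:
        BufferedImage rgb = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        System.out.println(rgb.getTransparency() == Transparency.OPAQUE);
    }
}
```

The only way I know of to get a genuine BITMASK image is GraphicsConfiguration.createCompatibleImage(w, h, Transparency.BITMASK), but that bypasses the constructor this question is about.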
-
Is getting the data array from a BufferedImage's DataBuffer the quickest way to manipulate and render bitmaps under Java 1.4.2?
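To clarify what I mean by manipulating the data array directly, this is the pattern I've been using (same single-bank DataBufferInt assumption as above; I've also read that grabbing the raw array can prevent the image from being cached for acceleration, which is part of why I'm asking about speed):

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;

public class DirectFill {
    public static void main(String[] args) {
        int w = 8, h = 8;
        BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);

        // Write packed ARGB values straight into the backing array,
        // skipping the per-pixel setRGB() call path.
        int[] pixels = ((DataBufferInt) img.getRaster().getDataBuffer()).getData();
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // Opaque pixel whose red/green channels encode x and y.
                pixels[y * w + x] = 0xFF000000 | (x << 16) | (y << 8);
            }
        }

        // Read back through the public API to confirm the layout assumption.
        System.out.println(Integer.toHexString(img.getRGB(3, 5)));
    }
}
```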
Also, I forgot to thank Abuse, Onyx and Trembovetski for helping me out with answers to my last post. So, here you go guys - thanks.