Fast and efficient way of scaling JPEG images?

I haven’t done any tests, but does anyone know how well Java performs at scaling, say, 2048x2048 (4-8 MB) JPEG images down to around 640x480 and compressing them to 300-400 kB?

I’m thinking of a server application that would perform this operation 24/7, but I can’t let it eat up all the memory or take many seconds per image.

Here’s what I need to do:

  1. Open big image.
  2. Scale down and compress.
  3. Save smaller image to a new file on disk.
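The three steps above can be sketched with the standard ImageIO and Graphics2D APIs. This is a minimal illustration, not a tuned implementation: the class name, file handling, and the choice of bilinear interpolation are all just assumptions.

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class Downscaler {
    // 1. open the big image, 2. scale it down, 3. save the smaller one.
    public static void scale(File src, File dst, int targetW, int targetH)
            throws Exception {
        BufferedImage big = ImageIO.read(src);            // step 1: decode from disk
        BufferedImage small = new BufferedImage(targetW, targetH,
                BufferedImage.TYPE_INT_RGB);
        Graphics2D g = small.createGraphics();            // step 2: scale down
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(big, 0, 0, targetW, targetH, null);
        g.dispose();
        ImageIO.write(small, "jpg", dst);                 // step 3: re-encode to disk
    }
}
```

Note that `ImageIO.write(img, "jpg", file)` uses the writer’s default quality; to hit a specific file-size target you’d need an explicit `ImageWriteParam`.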

The reason I’m asking is that I read some time ago that the Java VM has a limited amount of memory for this kind of thing, so I can’t load huge JPEG images into memory. Is that true?

And how fast is your test application after the warmup period?

2048 x 2048 pixels = 4 Mpixels, times 3 bytes per pixel = 12 MB uncompressed, which isn’t too much.

Are you reusing BufferedImages?

For heavy image processing one of my colleagues uses JAI. I don’t have any numbers, but I would expect it to outperform BufferedImages.

Thanks, I’ll take a look at JAI.

Just found a post on JAI performance. It doesn’t look good, so I take back my original post :-[

And the answer to my questions is…

BTW these are somewhat strange numbers: 4 Mpixel JPEGs at 4-8 MB. Either they are lossless, or the quantization is insanely low. In fact, lossless compression with BZX might be better nearly always, and BZX is a general-purpose data-analysis program that’s able to shrink files with some regularity.
640 x 480 x 3 ≈ 900 kB uncompressed. A lossy codec like JPEG should be able to drop that to 58-98 kB without any effort.
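Where in that 58-98 kB range the output lands is controlled by the encoder’s quality setting. A sketch of how to set it explicitly with ImageIO’s `ImageWriter` (the class name and the quality values are just illustrative choices):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;

public class JpegQuality {
    // Encode a BufferedImage as JPEG at an explicit quality setting.
    public static byte[] encode(BufferedImage img, float quality) throws Exception {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);   // 0.0f = smallest file, 1.0f = best quality
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        writer.setOutput(new MemoryCacheImageOutputStream(bos));
        writer.write(null, new IIOImage(img, null, null), param);
        writer.dispose();
        return bos.toByteArray();
    }
}
```

Dropping the quality from, say, 0.9f to 0.3f shrinks the file substantially on detailed images, at the cost of visible artifacts.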

So a few other questions.
What is the speed of loading the image into memory?
What is the speed of the disk?
Are you using B-splines or a Lanczos filter for downsampling?
What is the memory speed of the server?
What is the network latency, and are you processing the images in parallel?

You’ll run into problems dealing with 4 MB JPEGs.
4 MB doesn’t sound like much, but JPEG compresses very heavily, so uncompressed it might be above 50 MB, and then the default Java heap is not enough.
But if you think this is a serious problem, simply start Java with a bigger heap, e.g. java -Xmx200m.

Try using the ImageIO package (javax.imageio). It allows you to read an image and reduce its size in a single step; depending on the implementation, this may not require as much memory as reading in the entire original image. The downside is that the quality of the rescaling is likely to be reduced. Alternatively, it also allows you to read in parts of an image, so you could divide the large image into many pieces, rescale each piece separately (you will need some overlap), and finally recombine the reduced pieces.
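The read-and-reduce-in-one-step idea can be sketched with ImageIO’s subsampling support. The class name and the step factor are just examples; note that subsampling simply skips pixels, so the result is rougher than a proper downsampling filter:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class SubsampledRead {
    // Decode only every step-th pixel in each direction, so the reader
    // never has to hold the full-resolution image in memory at once.
    public static BufferedImage read(File src, int step) throws Exception {
        try (ImageInputStream in = ImageIO.createImageInputStream(src)) {
            ImageReader reader = ImageIO.getImageReaders(in).next();
            reader.setInput(in);
            ImageReadParam param = reader.getDefaultReadParam();
            // keep 1 pixel per step x step block, starting at offset (0, 0)
            param.setSourceSubsampling(step, step, 0, 0);
            BufferedImage img = reader.read(0, param);
            reader.dispose();
            return img;
        }
    }
}
```

For the 2048 → 640 case a subsampling step of 3 gets close to the target size; a final Graphics2D scale pass could then produce the exact 640x480 output.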