Best O.S. / Hard Drive Size for NIO Server load testing...

Hi

I am currently load testing an NIO Server to see the maximum number of Client connections that can be reached, using the following PC: P4, 3 GHz, 1 GB RAM, Windows XP SP2, 110 GB hard drive.
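To illustrate the kind of test I mean, a minimal client-side sketch might look like this (host and port are placeholders, and a real harness would do more bookkeeping):

```java
import java.net.InetSocketAddress;
import java.nio.channels.SocketChannel;
import java.util.ArrayList;
import java.util.List;

// Minimal connection-flood client: open sockets to the NIO server until
// the OS refuses, then report how many connections were reached.
public class ConnectionFlood {
    public static void main(String[] args) {
        InetSocketAddress server = new InetSocketAddress("localhost", 9000); // placeholder
        List<SocketChannel> channels = new ArrayList<SocketChannel>();
        try {
            while (true) {
                SocketChannel ch = SocketChannel.open();
                ch.connect(server);   // blocking connect keeps the sketch simple
                channels.add(ch);
            }
        } catch (Exception e) {
            // Usually "Too many open files" or a refused/failed connect
            System.out.println("Stopped after " + channels.size() + " connections: " + e);
        } finally {
            for (SocketChannel ch : channels) {
                try { ch.close(); } catch (Exception ignored) {}
            }
        }
    }
}
```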

However, I would also like to test the Server/Client performance on different OSes, e.g. Windows 2000 Server, Linux, Solaris.

Which would be the best possible option from the following:

  1. Partition my current drive (using e.g. Partition Magic) into e.g.:
     • Win XP: 80 GB
     • Windows 2000 Server: 10 GB
     • Linux: 10 GB
     • Solaris: 5 GB
     • Shared/Data: 5 GB
  2. Install a separate hard drive for the other operating systems.
  3. Use a disk caddy to swap test hard drives in and out.
  4. Anything else?
  • Would the size of the Operating System's hard drive partition affect the Server/Client performance, e.g. the number of connections, the number of file handles, the virtual memory, etc.?
  • Can anyone recommend an OS for Server testing?

Many Thanks,
Matt :wink:

We had very good experiences with Linux 2.6, although we never tested on Solaris.
With Linux 2.4 we sometimes hit the scalability limit, but now everything is as smooth as butter.
You may have to raise the maximum allowed number of file handles, which (on Fedora) is somewhere around 50700 by default.
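If you want to check what limit the JVM actually sees, the com.sun.management extension exposes it (Sun JVM on Unix only, so take this as a sketch):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.UnixOperatingSystemMXBean;

// Prints the per-process file descriptor limit as seen by the JVM.
// Works only on a Sun JVM on Unix, where the OperatingSystemMXBean
// implementation also implements UnixOperatingSystemMXBean.
public class FdLimit {
    public static void main(String[] args) {
        UnixOperatingSystemMXBean os =
                (UnixOperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
        System.out.println("Open file descriptors: " + os.getOpenFileDescriptorCount());
        System.out.println("Max file descriptors:  " + os.getMaxFileDescriptorCount());
    }
}
```

(The limit itself is usually raised with ulimit -n or /etc/security/limits.conf on Linux.)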

Regards, Clemens

I'd definitely recommend Solaris x86 10.

But then again, I work for Sun, so maybe my POV isn't worth that much :wink: