After doing your transparency sorting, when submitting the objects for the transparency pass, keep applying the basic state sorting as well. Large outdoor scenes that have hundreds of the same object (e.g. trees) don't need to incur texture state changes most of the time. We'd turned state sorting off completely for the transparent objects (as every text you read on the subject recommends). Using a partial sort on state after the depth sorting made a huge difference.
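For what it's worth, here's one way to read the "partial sort" trick as a minimal C++ sketch (the TransparentObject struct and the bucket size are my own assumptions, not our actual code): quantize depth into coarse buckets, sort back-to-front by bucket, and within a bucket order by texture so the hundreds of identical trees stay batched together:

```cpp
#include <algorithm>
#include <vector>

struct TransparentObject {
    float    viewDepth;  // distance from the camera; larger = farther away
    unsigned textureId;  // the state change we'd like to avoid repeating
};

void sortTransparent(std::vector<TransparentObject>& objects)
{
    const float bucketSize = 0.25f;  // depth quantum; tune per scene (assumed)
    auto bucketOf = [=](const TransparentObject& o) {
        return static_cast<long>(o.viewDepth / bucketSize);
    };
    std::sort(objects.begin(), objects.end(),
              [&](const TransparentObject& a, const TransparentObject& b) {
                  const long ba = bucketOf(a), bb = bucketOf(b);
                  if (ba != bb)
                      return ba > bb;  // back-to-front by depth bucket
                  // Same bucket: batch by texture to cut state changes.
                  return a.textureId < b.textureId;
              });
}
```

The bucket size is the trade-off knob: too coarse and you can reorder objects that genuinely overlap in depth (blending artifacts), too fine and nothing ends up sharing a bucket, so you get no batching benefit.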
Edit: Doh… wrong button, wanted preview not send…
Anyway, what I was saying: we run a lot of the Planet9 data through our applications for various things. In the Elumens dome mode (which does three or four render passes per final output pass) we jumped from averaging 20 FPS to 33 FPS just through this simple change. In non-dome mode we've jumped from 65 FPS to 100+ FPS. These are the really big outdoor scenes like Washington DC with the 2K×2K textures and so on. Smaller areas like the various downtown locations (San Diego, San Francisco, etc.) got less of a speed boost because there are fewer transparent objects. In the D.C. test, about 50% of the objects have some form of transparency.

From one version of a driver to the next, one vendor to the next, one OS to the next… the number of permutations is horrendous. So instead I have to apply a little "lowest common denominator" logic to it (e.g. what's the worst case? Uploading 12 MB of textures? etc.). You can also wrap your states up client-side with a boolean guard and attempt to track them and ignore dud state changes, though this can be fraught with difficulty if you don't have very thorough control of what's going on, and I suspect many of the later drivers might already optimize this case.
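For the boolean-guard idea, a minimal sketch (the wrapper names here are mine, not any real engine's API): shadow each piece of GL state client-side and skip driver calls that wouldn't change anything. The caveat above shows up directly in code: if anything touches GL without going through the wrappers, the shadow copies go stale and you start dropping real state changes.

```cpp
#include <GL/gl.h>

// Shadow copies, initialised to OpenGL's documented defaults.
static GLuint g_boundTexture = 0;      // default 2D texture binding is 0
static bool   g_blendEnabled = false;  // GL_BLEND is off by default

void bindTexture2D(GLuint tex)
{
    if (tex == g_boundTexture)
        return;  // dud state change: skip the driver call entirely
    glBindTexture(GL_TEXTURE_2D, tex);
    g_boundTexture = tex;
}

void setBlend(bool enable)
{
    if (enable == g_blendEnabled)
        return;  // already in the requested state
    if (enable)
        glEnable(GL_BLEND);
    else
        glDisable(GL_BLEND);
    g_blendEnabled = enable;
}
```

Combined with the bucketed sort above, most of the transparent batch ends up taking the early-return path instead of hitting the driver.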
But taking into account what seems to be the very nature of computer graphics (faking images with a lot of tricks so that you get the illusion of real physics), it is… well… natural to tailor your application to the one special case so that you can get the best out of it.