Mine is 1.0-SNAPSHOT as it appears in the pom file.
In fact, I'm not describing wrong behavior, but some code that I think is not correct, or at least that I can't understand.
I will try to make it apparent in an example, though I don't know whether this will be easy.
Ardor3D is mainly aimed at the fixed-pipeline case. It includes a few examples with shaders, but they are more the exception than the rule. I feel the current framework is not well suited for a general feature like "world clipping on the GPU", which would mean using shaders by default.
What I was trying to say is that the automatic culling mode (in case I'm wrong and it does work as promised) might not make a great difference in most common cases. It would be noticeable only if you have a huge model and you are zooming in on a very small region of it, so that many vertices are not sent to the card.
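For reference, this is roughly how I understand the mode is switched on or off; a minimal sketch, assuming the SceneHints/CullHint API I remember from the sources, so correct me if the calls are different:

```java
import com.ardor3d.scenegraph.Node;
import com.ardor3d.scenegraph.hint.CullHint;

public class CullHintExample {
    public static void main(final String[] args) {
        // A node whose subtree should be frustum-culled automatically.
        final Node model = new Node("hugeModel");

        // Dynamic: the renderer checks the bounding volume against the
        // frustum each frame and skips the draw call if it falls outside.
        model.getSceneHints().setCullHint(CullHint.Dynamic);

        // Never: always send the geometry to the card, no CPU-side check.
        // model.getSceneHints().setCullHint(CullHint.Never);
    }
}
```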
But I'm not saying that frustum culling is a big burden for the CPU either, as it only involves checking the bounding region of each mesh against the six planes of the frustum. Whatever the complexity of the mesh, only its bounding volume is tested, so the check should not be a major problem for the CPU. Only if your scene has a zillion micro-meshes moving around should you worry about it, and either clip on the GPU or simply skip the culling altogether.
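Just to illustrate how cheap that per-mesh test is, here is a rough sketch of the usual bounding-sphere-against-plane check (not Ardor3D's actual code; I'm assuming planes stored in Hessian normal form with normals pointing inward):

```java
public class FrustumCullSketch {

    /** A plane in Hessian normal form: normal . p = constant. */
    static class Plane {
        final double nx, ny, nz, d; // normal points towards the inside of the frustum

        Plane(final double nx, final double ny, final double nz, final double d) {
            this.nx = nx; this.ny = ny; this.nz = nz; this.d = d;
        }
    }

    /**
     * Returns true if a bounding sphere lies completely outside any of the six
     * frustum planes. This is the whole per-mesh cost of CPU frustum culling:
     * at most six dot products, whatever the triangle count of the mesh.
     */
    static boolean outsideFrustum(final Plane[] frustum,
                                  final double cx, final double cy, final double cz,
                                  final double radius) {
        for (final Plane p : frustum) {
            final double signedDistance = p.nx * cx + p.ny * cy + p.nz * cz - p.d;
            if (signedDistance < -radius) {
                return true; // sphere fully behind this plane -> cull the mesh
            }
        }
        return false; // intersects or is inside -> send it to the card
    }
}
```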
Thanks for pointing me to your maintained version; I will use it in the future.