RenderBin Exception

I’m getting an exception when trying to push a relatively large quantity of data to the Xith renderer. The error is:


Exception in thread "main" java.lang.NullPointerException
      at com.xith3d.render.RenderBin.addAtom(RenderBin.java:87)
      at com.xith3d.render.Renderer.addAtom(Renderer.java:163)
      at com.xith3d.scenegraph.View.renderNode(View.java:1088)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:968)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.renderNode(View.java:1015)
      at com.xith3d.scenegraph.View.getRenderFrame(View.java:824)
      at com.xith3d.scenegraph.View.renderOnce(View.java:717)
      at com.xith3d.scenegraph.View.renderOnce(View.java:655)
      at com.janusresearch.commsplanner.ui.openGL.RenderedCanvas.start(Unknown Source)
      at com.janusresearch.commsplanner.ui.CommsPlannerGUI.init(Unknown Source)
      at com.janusresearch.commsplanner.ui.CommsPlannerGUI.<init>(Unknown Source)
      at com.janusresearch.commsplanner.ui.CommsPlannerGUI.main(Unknown Source)

What I see in RenderBin.java is:


      final static int START_SIZE = 3000;
      final static int EXT_SIZE = 200;

and


      public RenderBin(StatePriorities priorities) {
            buckets = new RenderBucket[START_SIZE];
            for (int i = 0; i < START_SIZE; i++)
                  buckets[i] = new RenderBucket();
            maxSize = START_SIZE;
            curSize = 0;
            this.priorities = priorities;
      }

      public void addAtom(RenderAtom atom) {
            if (curSize == buckets.length) {
                  RenderBucket[] newBuckets =
                        new RenderBucket[buckets.length + EXT_SIZE];
                  System.arraycopy(buckets, 0, newBuckets, 0, buckets.length);
                  buckets = newBuckets;
            }
            buckets[curSize++].setAtom(atom);
      }

What I’m wondering is, why the limits on the bucket? Who says someone won’t have massive hardware capable of rendering many more atoms? Any thoughts?

Would you like me to add an Option to RenderOptions to let one specify these limits?

Would using a LinkedList have a lot of overhead in this case? I wouldn’t think you’d get much garbage, nor much of a slowdown. In fact, you might waste less memory, since you only allocate what you need.

Will.

Hi,

[quote]What I’m wondering is, why the limits on the bucket?
[/quote]
There should be no limit; the buckets array grows as needed.

[quote]Who says someone won’t have massive hardware capable of rendering many more atoms?
[/quote]
I guess nobody says that.

[quote]Any thoughts?
[/quote]
Sure. This looks like a bug, and your NPE has nothing to do with the limits. Once you reach the array-growing point, the old buckets are copied into the new array, but fresh RenderBucket instances are never allocated for the added slots, so `buckets[curSize++]` is null when `setAtom` is called on it. We should fix this.
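A minimal sketch of the fix described above: after growing the array, the new slots need their own RenderBucket instances before `setAtom` is called on them. The class names mirror the Xith source, but the stubs below are simplified stand-ins for illustration, not the real Xith classes.

```java
// Simplified stand-ins for the Xith classes involved.
class RenderAtom { }

class RenderBucket {
    private RenderAtom atom;
    void setAtom(RenderAtom atom) { this.atom = atom; }
}

class RenderBinSketch {
    static final int START_SIZE = 3000;
    static final int EXT_SIZE = 200;

    private RenderBucket[] buckets;
    private int curSize;

    RenderBinSketch() {
        buckets = new RenderBucket[START_SIZE];
        for (int i = 0; i < START_SIZE; i++)
            buckets[i] = new RenderBucket();
        curSize = 0;
    }

    public void addAtom(RenderAtom atom) {
        if (curSize == buckets.length) {
            RenderBucket[] newBuckets =
                new RenderBucket[buckets.length + EXT_SIZE];
            System.arraycopy(buckets, 0, newBuckets, 0, buckets.length);
            // The missing step: preallocate buckets for the new slots,
            // otherwise buckets[curSize++] below is null.
            for (int i = buckets.length; i < newBuckets.length; i++)
                newBuckets[i] = new RenderBucket();
            buckets = newBuckets;
        }
        buckets[curSize++].setAtom(atom);
    }

    int size() { return curSize; }
}
```

With this change, pushing more than START_SIZE atoms simply extends the array by EXT_SIZE at a time instead of throwing the NPE in the stack trace above.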

[quote]Would you like me to add an Option to RenderOptions to let one specify these limits?
[/quote]
No, there should be no need for that.

[quote]Would using a LinkedList have a lot of overhead in this case?
[/quote]
Yes. Iterating over an array is much faster, I believe, and the preallocated buckets are reused every frame rather than generating garbage.
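A rough sketch of the trade-off (hypothetical code, not from Xith): the array-of-buckets design allocates its capacity once and reuses the slots every frame by resetting curSize, whereas a LinkedList creates a node object for every added element, every frame, which the garbage collector must later reclaim. Array iteration is also index-based and cache-friendly, while the list walks node pointers.

```java
import java.util.LinkedList;
import java.util.List;

class BinStyles {
    // Array style: capacity is allocated up front; filling the bin
    // just overwrites slots, so no per-element allocation occurs.
    static int fillArray(Object[] buckets, Object[] atoms) {
        int curSize = 0;
        for (Object atom : atoms)
            buckets[curSize++] = atom; // slot reuse, no allocation
        return curSize;
    }

    // List style: every add() allocates a fresh node object, which
    // becomes garbage as soon as the list is dropped after the frame.
    static List<Object> fillList(Object[] atoms) {
        List<Object> bin = new LinkedList<>();
        for (Object atom : atoms)
            bin.add(atom); // allocates a node each time
        return bin;
    }
}
```

In a per-frame render loop, the array version's steady-state allocation rate is zero, which is the main reason to prefer it over a linked list here.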

Yuri