I’m writing a Java3D application that will display a stereo view of a virtual world.
The left eye view will be displayed in one JFrame and the right eye view in another JFrame.
The PC running the application will have two physical screens (really two projectors), and one JFrame will be displayed, maximized and undecorated, on each screen.
The images from the two projectors will be polarized differently and then overlaid, and the user will view the result through polarized glasses. No head tracking or anything more complicated.
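For placing the two frames I have something along these lines in mind (just a sketch, not my actual code; the frame and template names are placeholders, using java.awt.*, javax.swing.JFrame and javax.media.j3d.GraphicsConfigTemplate3D):

// One undecorated, screen-filling JFrame per physical screen/projector.
GraphicsDevice[] screens =
    GraphicsEnvironment.getLocalGraphicsEnvironment().getScreenDevices();
GraphicsConfigTemplate3D template = new GraphicsConfigTemplate3D();

GraphicsConfiguration gcLeft  = screens[0].getBestConfiguration(template);
GraphicsConfiguration gcRight = screens[1].getBestConfiguration(template);

JFrame frameLeft  = new JFrame(gcLeft);
JFrame frameRight = new JFrame(gcRight);
for (JFrame f : new JFrame[] { frameLeft, frameRight }) {
    f.setUndecorated(true);
    f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
}
// Cover each screen completely (the bounds include each screen's position
// in the virtual desktop, so each frame lands on its own screen).
frameLeft.setBounds(gcLeft.getBounds());
frameRight.setBounds(gcRight.getBounds());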
Extract from my actual view setup code so far:
…
Canvas3D canvas3D_A = new Canvas3D(gc);
canvas3D_A.setMonoscopicViewPolicy(View.LEFT_EYE_VIEW); // <– this canvas renders only the left eye's view
Canvas3D canvas3D_B = new Canvas3D(gc);
canvas3D_B.setMonoscopicViewPolicy(View.RIGHT_EYE_VIEW); // <– this canvas renders only the right eye's view
…
View view = new View();
view.addCanvas3D(canvas3D_A);
view.addCanvas3D(canvas3D_B);
…
ViewPlatform viewPlatform = new ViewPlatform();
view.attachViewPlatform(viewPlatform);
…
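To be concrete, the fuller view-branch wiring I have in mind is roughly this (a sketch with the elided parts spelled out, not my finished code; frameLeft/frameRight are the two frames from above, the numbers are placeholders, classes from javax.media.j3d.* and javax.vecmath.Vector3f):

// Defaults for the physical body/environment.
PhysicalBody body = new PhysicalBody();
PhysicalEnvironment environment = new PhysicalEnvironment();

View view = new View();
view.setPhysicalBody(body);
view.setPhysicalEnvironment(environment);
view.addCanvas3D(canvas3D_A);   // left-eye canvas, lives in frameLeft
view.addCanvas3D(canvas3D_B);   // right-eye canvas, lives in frameRight

ViewPlatform viewPlatform = new ViewPlatform();
viewPlatform.setViewAttachPolicy(View.NOMINAL_HEAD);           // or NOMINAL_SCREEN?
view.setWindowEyepointPolicy(View.RELATIVE_TO_FIELD_OF_VIEW);  // or RELATIVE_TO_WINDOW?
view.attachViewPlatform(viewPlatform);

// The ViewPlatform has to live in the scene graph.
Transform3D vpTransform = new Transform3D();
vpTransform.setTranslation(new Vector3f(0f, 0f, 2.4f));  // back the viewer away from the scene
TransformGroup vpGroup = new TransformGroup(vpTransform);
vpGroup.addChild(viewPlatform);

BranchGroup viewBranch = new BranchGroup();
viewBranch.addChild(vpGroup);

VirtualUniverse universe = new VirtualUniverse();
Locale locale = new Locale(universe);
locale.addBranchGraph(viewBranch);

// Each canvas goes into its own frame.
frameLeft.add(canvas3D_A);
frameRight.add(canvas3D_B);
frameLeft.setVisible(true);
frameRight.setVisible(true);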
Is this the right/best approach?
Which view attach policy should I use:
View.NOMINAL_HEAD (the default) or View.NOMINAL_SCREEN?
Which window eyepoint policy (on the View object):
RELATIVE_TO_WINDOW or RELATIVE_TO_FIELD_OF_VIEW (the default)?
(Or RELATIVE_TO_SCREEN / RELATIVE_TO_COEXISTENCE?)
PhysicalBody: will the default do?
PhysicalEnvironment: not really used in this setup?
Is it easier to use compatibility mode?
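If compatibility mode really is the easier route, I imagine it would look something like this (again only a sketch; all the numbers are made up and the per-eye maths may well be incomplete):

// Compatibility mode (sketch): bypass Java3D's head/screen view model and
// hand the View explicit per-eye projection matrices, much like raw OpenGL.
view.setCompatibilityModeEnable(true);

double eyeSep = 0.065;      // assumed eye separation (metres)
double screenDist = 2.0;    // assumed viewer-to-screen distance (metres)
double near = 0.1, far = 100.0;
double top = near * Math.tan(Math.toRadians(22.5));  // roughly 45 degree vertical FOV
double right = top * (16.0 / 9.0);                   // assumed 16:9 screens
double shift = 0.5 * eyeSep * near / screenDist;     // off-axis frustum shift

Transform3D leftProj = new Transform3D();
leftProj.frustum(-right + shift, right + shift, -top, top, near, far);
Transform3D rightProj = new Transform3D();
rightProj.frustum(-right - shift, right - shift, -top, top, near, far);
view.setLeftProjection(leftProj);
view.setRightProjection(rightProj);

// The lateral eye offsets (plus/minus eyeSep/2) would still have to be
// composed into these projections or into the view transform.
view.setVpcToEc(new Transform3D());   // identity: eye at the view platform origin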
Regards,
Erik