Currently I see no way of distinguishing user-interface events from events on the canvas. In most apps the UIEventAdapter catches all events, using them for dragging UIWindows or handing them over to the appropriate JComponent. Along the way the event itself is not changed, except that its coordinates are translated to be relative to the UIWindow it occurred in. Usually an app has a second event handler (for instance, for picking), but this handler has no chance to detect whether the event occurred in a UIWindow or not (correct me if I'm wrong here).
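To illustrate, here is a rough sketch of such a second handler (the class and method names are just placeholders, not actual Xith3D API):

import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;

// Hypothetical app-side picking handler: without some marker on the
// event, it cannot tell whether the click already landed on a UIWindow.
public class ScenePickingHandler extends MouseAdapter {
    @Override
    public void mousePressed(MouseEvent me) {
        // Fires for every click, including those on UIWindows,
        // because the event carries no hint about its origin.
        pickSceneObjectAt(me.getX(), me.getY());
    }

    private void pickSceneObjectAt(int x, int y) {
        // ... perform picking against the scene (omitted) ...
    }
}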
A simple example: the user clicks somewhere on a UIWindow without triggering a specific action. Usually this should not lead to any reaction within the scene, because the user often doesn't even know which scene object lies under the click.
I have added a simple one-line patch (see below) to the UIWindowManager that sets the source of a mouse event to the UIWindow the event occurred in. This allows the app to find out whether the event occurred in a UIWindow, and if so, in which one. This works for me.
How is this supposed to work in Xith3D? If you've run into this problem: how did you handle it?
// if we get here but have not consumed the mouse event,
// determine whether it should be forwarded to an overlay
if (!eventConsumed) {
    if (w != null) {
        Object o = w.getOverlay();
        if (o instanceof UIWindow) {
            // translate the event's coordinates so they are
            // relative to the UIWindow it occurred in
            me.translatePoint(-(int) w.getRectangle().getX(),
                              -(int) w.getRectangle().getY());
            // the patch: tag the event with the UIWindow it hit,
            // so downstream handlers can identify its origin
            me.setSource((UIWindow) o);
            ((UIWindow) o).dispatchEvent(me);
        }
    }
}
(code always looks ugly in this forum)
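With the patch applied, a handler like the one sketched above can filter out UI events by checking the source (again just a sketch; the instanceof check is the only part that matters):

public void mousePressed(MouseEvent me) {
    if (me.getSource() instanceof UIWindow) {
        // The click landed on a UIWindow; ignore it for scene picking.
        return;
    }
    // ... normal scene picking ...
}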
