Hi!
I can’t even call myself a “newbie” Java programmer, because I’m only halfway through Sun’s Java tutorial… :-/ But I have a big (hopefully not too big) project in my head. I call it “Optical control for sound synthesis manipulations”. Basically, I want to simulate optical effects (reflection and refraction) in 3D and use them to control sound synthesis processes. The simulation would transmit OSC (Open Sound Control) messages to a sound synthesis engine, which could be built in any application that understands OSC: Max/MSP, SuperCollider, Reaktor, etc. I’m posting here because, as far as I’ve searched, JOGL is the only API whose demos do on screen exactly what I’d love to see happening in my app.
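To make the OSC side concrete, here is roughly what the sending code could look like with the JavaOSC library from illposed.com (untested; the /filter/freq address and port 57120 are placeholders I made up, the receiving patch defines the real ones):

import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortOut;
import java.net.InetAddress;

public class OscSenderSketch {
    public static void main(String[] args) throws Exception {
        // Send to a synth engine listening on this machine; 57120 is where
        // SuperCollider's sclang usually listens (adjust for Max/MSP etc.)
        OSCPortOut sender = new OSCPortOut(InetAddress.getLocalHost(), 57120);

        // Hypothetical address pattern and argument: set a filter frequency.
        OSCMessage msg = new OSCMessage("/filter/freq");
        msg.addArgument(Float.valueOf(880.0f));

        sender.send(msg);
        sender.close();
    }
}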
For example, take the VertexBufferObject demo and extract (and transmit in OSC format) the parameters of the specular reflections on the surface (number, size, brightness, color, position, shape?), then make the parameters of a bandpass filter (bandwidth, center frequency and amplitude) follow the changes on the screen. I think this could result in a simple but very organic sound. Moreover, the sound and the visual image are then much more likely to be perceived by the user as one event: “what you see is what you hear”. A naive sketch of the kind of mapping I have in mind is below.
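All the ranges here are pure guesses on my part, and a linear remap is probably too crude (frequency should most likely be mapped exponentially), but it shows the idea:

public class HighlightToFilter {

    /** Linearly remap v from [inLo, inHi] to [outLo, outHi]. */
    static float remap(float v, float inLo, float inHi, float outLo, float outHi) {
        float t = (v - inLo) / (inHi - inLo);
        return outLo + t * (outHi - outLo);
    }

    public static void main(String[] args) {
        // Per-frame measurements of one highlight (normalized 0..1), e.g.
        // read back from the rendered image or computed from the
        // light/eye/normal vectors.
        float brightness = 0.8f; // peak intensity of the highlight
        float size       = 0.3f; // fraction of the surface it covers
        float posX       = 0.6f; // horizontal position on screen

        // Map to bandpass filter parameters.
        float amplitude    = brightness;                       // 0..1
        float centerFreqHz = remap(posX, 0f, 1f, 200f, 5000f); // left = low, right = high
        float bandwidthHz  = remap(size, 0f, 1f, 50f, 2000f);  // bigger highlight = wider band

        System.out.printf("amp=%.2f cf=%.1fHz bw=%.1fHz%n",
                amplitude, centerFreqHz, bandwidthHz);
        // These three values would then go out as one OSC message,
        // e.g. "/bpf" [amplitude, centerFreqHz, bandwidthHz].
    }
}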
The basic idea, expressed as a pseudo-empirical formula:
(optical event complexity + 3D-space navigation simplicity) * sound metaphor = “ultimate trip experience”?
The questions are:
Do you think it’s possible?
Is this the best way to do it, or would you suggest other options?
Thanx for reading this post to the end.
ANY reply/question/comment is GREATLY appreciated.
Ciao!