Is it an okay dev path to first write for desktop with LWJGL, then convert to Android?

Converting my game to Libgdx is clearly a big task, and I’m disappointed that my custom-generated sounds would have to go through an audio device that only plays mono and has a large latency (via “AudioDevice” – Playing PCM Audio http://code.google.com/p/libgdx/wiki/AudioDevice).

Am I misreading this, or do sound.play and music.play suffer from the same latency issues cited above? I’m pretty sure at least with sound.play and music.play one has control of volume and panning settings, so stereo must be possible, unlike with AudioDevice.

The implementations of AudioDevice, sound.play, and music.play all seem to rely directly on LWJGL’s implementation of OpenAL. I’m downloading the source now to poke around and see if I can’t rig up a way to play audio from the functional equivalent of a TargetDataLine and get stereo and decent latencies.
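To make concrete what I mean, here is roughly the shape of the playback loop I’m after on the desktop side, sketched with javax.sound.sampled (for output that’s technically a SourceDataLine rather than a TargetDataLine; the class name and fill method below are made up, and 16-bit/44.1 kHz stereo and the buffer sizes are just assumptions):

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

public class PcmPlaybackSketch {
    public static void main(String[] args) throws LineUnavailableException {
        // 44100 Hz, 16-bit, 2 channels (stereo), signed, little-endian
        AudioFormat format = new AudioFormat(44100, 16, 2, true, false);
        SourceDataLine line = AudioSystem.getSourceDataLine(format);
        line.open(format, 4096); // small internal buffer to keep latency down
        line.start();

        byte[] buffer = new byte[2048];
        for (int i = 0; i < 1000; i++) {          // stream a bounded number of chunks for the sketch
            fillWithGeneratedSamples(buffer);      // the synth writes interleaved L/R frames here
            line.write(buffer, 0, buffer.length);  // blocks while the line's buffer is full
        }
        line.drain();
        line.close();
    }

    // Placeholder: real code would render the next chunk of the custom-generated sound.
    static void fillWithGeneratedSamples(byte[] buffer) {
        java.util.Arrays.fill(buffer, (byte) 0);   // silence, so the sketch runs as-is
    }
}
```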

Whatever.

The main question is this: is it reasonable to rewrite my game’s graphics via LWJGL’s implementation of OpenGL, and to include whatever I can conjure up to my satisfaction for the audio, as a first step, and then, once that is working, do whatever is needed to make an Android-playable version of the game?

If Libgdx becomes necessary at that second stage, will I at least be able to keep any working LWJGL/OpenGL graphics already coded, and mostly only have to deal with the headache of rewriting the input interface and dealing with screens? Are there other paths to recoding my MouseMotionListeners and such without using Libgdx? How much pain are we looking at?
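From what I can tell so far, Libgdx’s counterpart to a MouseMotionListener would be an InputProcessor (or the InputAdapter convenience class). Just to sketch the shape of that rewrite (the class name and comments are made up, not code from my game):

```java
import com.badlogic.gdx.InputAdapter;

public class MouseMotionSketch extends InputAdapter {
    @Override
    public boolean mouseMoved(int screenX, int screenY) {
        // roughly what mouseMoved(MouseEvent) did in the AWT listener;
        // note screenY is measured from the top of the window
        return true; // true = event handled
    }

    @Override
    public boolean touchDragged(int screenX, int screenY, int pointer) {
        // roughly mouseDragged(MouseEvent); on Android this also fires for finger drags
        return true;
    }
}

// registered once at startup, e.g. in create():
// Gdx.input.setInputProcessor(new MouseMotionSketch());
```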

AudioDevice allows stereo playback. The latency is as good as it gets on Android (read horrible). The only other API on Android you can use is OpenSL, which has the exact same latency as AudioDevice.
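For the stereo part, something along these lines works, as a rough sketch (44100 Hz assumed; the class and fill method are placeholders, it has to run inside a live libGDX application, and you’d normally feed the device from its own thread so the blocking write doesn’t stall rendering):

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.AudioDevice;

public class StereoPcmSketch {
    // Streams generated, interleaved L/R 16-bit samples; isMono = false gives a stereo device.
    public static void streamChunks(int chunks) {
        AudioDevice device = Gdx.audio.newAudioDevice(44100, false);
        short[] buffer = new short[2048];
        for (int i = 0; i < chunks; i++) {
            fillWithGeneratedSamples(buffer);              // your generator writes the next chunk
            device.writeSamples(buffer, 0, buffer.length); // blocks until the chunk is queued
        }
        device.dispose();
    }

    // Placeholder for the custom sound generation.
    private static void fillWithGeneratedSamples(short[] out) {
        java.util.Arrays.fill(out, (short) 0);             // silence, so the sketch runs as-is
    }
}
```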

Regarding GL: if you stick to OpenGL ES functionality on the desktop you’ll be fine. But it’s extremely easy to mess up.
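A typical way it goes wrong, sketched against LWJGL 2.x (class name made up): all of the below runs happily on desktop GL, but immediate mode and the fixed-function pipeline simply don’t exist in OpenGL ES 2.0, so none of it carries over to Android.

```java
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public class DesktopOnlyGlSketch {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(640, 480));
        Display.create();

        while (!Display.isCloseRequested()) {
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);

            // Renders fine on the desktop, but glBegin/glEnd and glColor3f are
            // fixed-function/immediate-mode calls that GL ES 2.0 removed; the
            // ES-friendly route is vertex arrays/VBOs plus shaders.
            GL11.glBegin(GL11.GL_TRIANGLES);
            GL11.glColor3f(1f, 0f, 0f);
            GL11.glVertex2f(-0.5f, -0.5f);
            GL11.glVertex2f(0.5f, -0.5f);
            GL11.glVertex2f(0f, 0.5f);
            GL11.glEnd();

            Display.update();
            Display.sync(60);
        }
        Display.destroy();
    }
}
```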

There’s a reason why gdx exists :slight_smile:

@badlogicgames
Thanks for clarifying about the latencies.

Is the latency equally bad for sound.play() effects playback? Do bullets/explosions/utility clicks routinely lag by over a tenth of a second?

Or does this just apply to generated PCM data playback?

Do we get better latencies with pads as opposed to phones? Do you know generally how much better?

Android audio support has always been pretty awful. :frowning:

As to LWJGL vs LibGDX… I don’t see much of a point in using LWJGL unless you just want to feel like a rebel. LibGDX exposes GL ES, so you can call glXXX functions directly. More likely, though, you will want to use LibGDX’s utilities to increase your productivity and make your time as a developer less painful. Further, if you ever plan to support more than the desktop, LWJGL isn’t really a great option.
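To illustrate the “call glXXX functions directly” part, a minimal sketch assuming current LibGDX (class name made up; LwjglApplication is the desktop backend launcher):

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.graphics.GL20;

public class DirectGlSketch extends ApplicationAdapter {
    @Override
    public void render() {
        // Gdx.gl is the GL ES 2.0 interface; the same calls run on desktop and Android
        Gdx.gl.glClearColor(0.1f, 0.1f, 0.2f, 1f);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        // ...raw glXXX draw calls here, or SpriteBatch/Mesh/ShaderProgram on top of them
    }

    public static void main(String[] args) {
        new LwjglApplication(new DirectGlSketch()); // desktop launcher; Android uses its own launcher instead
    }
}
```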

That pretty much sums up my stance every time this subject comes up. :wink:

Thanks, Davedes; I have a lot of respect for your opinion (as well as that of many others here!).

I’m feeling stymied, though. As far as personal satisfaction goes, a big part of it is the sound design and composing; otherwise I start asking myself wtf I’m doing here when I could be investing more time just writing music and selling that.

OK, get a grip…

So far, I am really pleased with LWJGL. It is a thing of beauty, an implementation that fully expresses what I love about Java and rekindled my passion for programming a couple years ago.

Two weeks ago I passed a kidney stone. I know, TMI. But the relief I experienced then doesn’t match the relief I am feeling now, letting go of GWT/Android plugins, Libgdx, et al., and discovering that LWJGL is so well written.

On to learning OpenGL!
:slight_smile: