I'm thinking about rendering an incoming RTP stream using JMF onto a cube in Xith3D. Has anyone else tried this, and what was your experience?
Was it slow?
Did it render well?
Anything else?
Thanks…
Yes, I have a video system in production on live TV.
Do not use JMF.
Instead we ended up writing our own JNI bindings to NCT's video ActiveX control. We don't use Xith (it's all direct-to-the-metal LWJGL), but that shouldn't make any difference. The real issue is that JMF is a very poor implementation indeed.
Cas 
Why is LWJGL better than xith?
It’s not, it’s just a bare metal API for doing OpenGL, not a scenegraph like Xith.
Xith is available on top of either LWJGL or JOGL.
Cas 
[quote]Why is LWJGL better than xith?
[/quote]
Why do you ask? Cas didn't write that LWJGL is better than Xith3D (they aren't directly comparable anyway).
I'm new, and I keep hearing that this is better than that, etc. etc. etc.
You’ve got quite a few posts on the board asking various questions… what project(s) are you working on, etc.?
Cas 
I've already built a webcam internet phone system running on JMF; it works well!
I'm currently building a web multiplayer tenpin bowling game using Xith3D and Odejava.
I’m glad you got JMF working eventually
I had a lot less control of what format data I had to display.
Cas 
Hi princec,
if you have some experience displaying TV images using LWJGL, could you give me some hints about it?
I do image processing in C using Video4Linux2, and I am planning to use Xith for some visualization. Using JNI, how do I get the image data into a texture? What data format is best (I get plain RGB24 or RGB32 from the card)?
Any ideas ?
Thanks,
Ca$cade
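Not an authoritative answer, but here's a minimal pure-Java sketch of the hand-off that usually happens here: the JNI side fills a direct `ByteBuffer` (via `NewDirectByteBuffer` or by writing into one allocated from Java), and on the Java side you swizzle BGR24 to RGB24 before uploading with `glTexImage2D(..., GL_RGB, ...)`. The class and method names below are made up for illustration; note that GL 1.2+ also accepts `GL_BGR` as a source format, which lets you skip this copy entirely.

```java
import java.nio.ByteBuffer;

public class FrameConvert {
    // Swizzle a BGR24 frame (as many V4L2 grabbers deliver it) into RGB24.
    // Both buffers are direct, so the JNI side can fill src without an extra
    // copy, and dst can be handed straight to glTexImage2D with GL_RGB.
    public static void bgrToRgb(ByteBuffer src, ByteBuffer dst, int pixels) {
        for (int i = 0; i < pixels; i++) {
            byte b = src.get();
            byte g = src.get();
            byte r = src.get();
            dst.put(r).put(g).put(b);
        }
        dst.flip(); // ready for reading by the GL upload call
    }

    public static void main(String[] args) {
        ByteBuffer src = ByteBuffer.allocateDirect(3);
        src.put((byte) 1).put((byte) 2).put((byte) 3); // one BGR pixel
        src.flip();
        ByteBuffer dst = ByteBuffer.allocateDirect(3);
        bgrToRgb(src, dst, 1);
        System.out.println(dst.get(0) + "," + dst.get(1) + "," + dst.get(2));
        // prints 3,2,1
    }
}
```

The important part is using a direct buffer: a heap `byte[]` would have to be pinned or copied on every upload, while a direct buffer gives the native side and GL a stable address.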
yeah, spill it dude 
I used QuickTime for Java, but it's all changed there with
Java 1.4, so now I tend not to bother with video unless
it's for a giggle. QTJ is too slowwww… and all I want is the
input's pixel data… then I can pass the image
data to wherever. May have to decompress the data so
it can be used.
camera -> codec -> DirectBuffer ??
It’s pretty crap that there is no simple and free way to display an MPEG-4 or similarly formatted video in Java across all three platforms. I found a wonderful IBM pure-Java MPEG-4 codec which did everything, BUT it cost over $4,000 :o
Will.
hmm, don’t know if this applies to my problem. I get a raw image buffer (RGB24 or BGR24) from my framegrabber into RAM (at the C level).
I am planning to use JNI to get the buffer mapped into Java and subsequently displayed as a texture.
I hope this works … does anyone have experience scaling a 762x576 (PAL) image to an appropriate texture size? Perhaps 512x512?
Thanks for hints and infos,
Ca$cade
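For what it's worth, here's one way the downscale might look, as a hedged sketch (class and method names are my own): render the PAL frame into a 512x512 `BufferedImage` with `Graphics2D`. Older GL drivers require power-of-two texture dimensions, so 512x512 is a safe target; the aspect-ratio squash is undone by drawing the textured quad at 4:3.

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class PalToTexture {
    // Scale a PAL-sized frame down to a square power-of-two texture image.
    public static BufferedImage scale(BufferedImage frame, int size) {
        BufferedImage tex = new BufferedImage(size, size, BufferedImage.TYPE_3BYTE_BGR);
        Graphics2D g = tex.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        // Stretch the whole source frame into the square target.
        g.drawImage(frame, 0, 0, size, size, null);
        g.dispose();
        return tex;
    }
}
```

An alternative that avoids the per-frame CPU scaling cost entirely: allocate a 1024x1024 texture once, upload the unscaled frame into its corner with `glTexSubImage2D`, and adjust the quad's texture coordinates to show only the used region.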
I would also like to hear how to stream frequent image frames to a texture. Let’s say one side of the box should display stream data.
What I don’t get is how to update an existing texture in OpenGL and still keep a decent framerate. If you want to run AVI video or a webcam picture, it should run at 10–24 frames per second minimum.
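The usual trick, as far as I know, is to create the texture once and then update it in place each frame with `glTexSubImage2D`, never recreating it with `glTexImage2D`. Below is a sketch of the CPU half of that loop (the class name and the `TYPE_3BYTE_BGR` source assumption are mine; the GL call in the comment uses LWJGL-style names):

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.nio.ByteBuffer;

public class TextureStreamer {
    private final ByteBuffer pixels;

    public TextureStreamer(int w, int h) {
        // Reused every frame -- allocating a direct buffer per frame
        // would wreck the framerate.
        pixels = ByteBuffer.allocateDirect(w * h * 3);
    }

    // Copy one decoded frame into the direct buffer. On the render thread
    // you would then update the pre-created texture in place with, e.g.:
    //   GL11.glTexSubImage2D(GL11.GL_TEXTURE_2D, 0, 0, 0, w, h,
    //                        GL12.GL_BGR, GL11.GL_UNSIGNED_BYTE, pixels);
    public ByteBuffer update(BufferedImage frame) {
        // TYPE_3BYTE_BGR images back onto a plain byte[] we can bulk-copy.
        byte[] data = ((DataBufferByte) frame.getRaster().getDataBuffer()).getData();
        pixels.clear();
        pixels.put(data);
        pixels.flip();
        return pixels;
    }
}
```

`glTexSubImage2D` only transfers pixel data into existing texture storage, so on most drivers it is far cheaper than respecifying the texture; 10–24 fps at webcam resolutions should be comfortably within reach this way.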
One question,
what is the best way to update a texture with a live video feed (via JNI)?
ImageComponent
ImageComponent.BufferedImage
Any ideas ? Anyone done that already ?
thanks,
Ca$cade