Xith and game production pipeline

Hi, I’m evaluating Xith3D from the “game production” perspective, and I have some questions for the experts around:

The game is based on “levels” (rooms, walls, doors…), but they remain rather simple (think of a 3D view of an old 2D top-view tiled game).

My pipeline is currently the following:

  • custom level editor with full support for behaviours/interactions and navigation between levels
  • custom “enhancer” plug-in for mapping textures and positioning simple meshes around
  • standard tools to create textures (GIMP, Photoshop with plug-ins)

When the game begins:

  • unoptimized beta version of a rendering engine (using procedural generation of some parts of the level, to avoid defining everything at level-design time)
  • game engine (event-driven), nearly finished

Now I’d like to switch to the following production pipeline:

  • custom level editor => generating 3D models (ASE, or another format?)
  • enhancing the level models with third-party tools (3ds Max, Blender, etc.)

The game will become Xith-driven:

  • Xith rendering of 3D models
  • same custom engine for gameplay

The most important gains for me are:

  • avoid spending time on the rendering engine
  • give draft 3D models to artists and let them beautify the models (no more “procedural generation”)
  • spend more time on gameplay

I think I’m missing some information about Xith before beginning:

  1. among the different model loaders available in xith-tk, which ones are the most trusted?
  2. is keyframe animation/interpolation part of Xith (or available via third-party tools), or will I have to write it?
  3. I’d rather start updating models with Blender, as I’m fluent with this tool; are there any issues when exporting from Blender to Xith? (What’s the best format?)

Any help welcome, and I’ll keep you posted on my experiments.

Lilian

For static models: ASE is fairly well supported, and Wavefront OBJ too.
For animated models: MD2 is, for now, the best way to go.
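Both ASE and OBJ are plain-text formats, which is part of why loader support for them tends to be solid. As an illustration (this is not the xith-tk loader, just a hypothetical sketch of how simple the format is), here is a minimal Java routine that pulls vertex positions out of OBJ text:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: extract vertex positions from Wavefront OBJ text.
// NOT the xith-tk loader -- just an illustration of the format:
// plain text, one statement per line, "v x y z" for vertex positions.
public class ObjVertexSketch {
    public static List<float[]> parseVertices(String objText) {
        List<float[]> vertices = new ArrayList<>();
        for (String line : objText.split("\n")) {
            String[] tok = line.trim().split("\\s+");
            if (tok.length >= 4 && tok[0].equals("v")) {
                vertices.add(new float[] {
                    Float.parseFloat(tok[1]),
                    Float.parseFloat(tok[2]),
                    Float.parseFloat(tok[3])
                });
            }
        }
        return vertices;
    }

    public static void main(String[] args) {
        String obj = "# a single triangle\nv 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n";
        System.out.println(parseVertices(obj).size()); // prints 3
    }
}
```

A real loader of course also has to handle texture coordinates (`vt`), normals (`vn`), faces (`f`), and the .mtl material file, but the line-oriented structure stays the same.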

The xith-tk MD2 loader handles interpolation very well.

You can do:
Blender >-[Export]-> Wavefront OBJ >-[Load]-> Xith
It looks like the texture file isn’t exported, but the UV coordinates are, so you just need to add a `map_Kd yourfile.jpg` line to your .mtl file.
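For reference, a minimal patched .mtl entry would look like this (the material name and texture filename below are just placeholders; use whatever Blender wrote for your material):

```
newmtl Material
Kd 1.000 1.000 1.000
map_Kd yourfile.jpg
```

The `map_Kd` statement names the diffuse texture, and the UV coordinates exported in the .obj tell the loader how to apply it.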
As for MD2, I don’t know whether that pipeline works. I once tried to export to MD2 from Blender, but with ~200 frames and a high-poly character the export didn’t succeed.
In theory, though, it should work. (I’ll give it another try one of these days.)

As far as I know, there is no interpolation/keyframe support inside Xith.
I think the only available feature is the interpolation at load time that was recently added to the MD2 loader.
This works, but it makes the memory requirements quite a lot higher (compared to what you would need with runtime interpolation, for example with a vertex program).
I may be wrong, since I stopped using Xith a few months ago; please correct me if that is the case.

I don’t think it’s load-time interpolation; the MD2Loader really has several classes that hold the keyframes and compute the interpolation in real time (IIRC).
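Whichever way the MD2 loader does it, runtime keyframe interpolation itself is just a per-vertex linear blend between two stored frames. A minimal sketch in Java (an illustration of the technique, not the actual MD2Loader code):

```java
// Minimal sketch of runtime keyframe interpolation: blend each vertex
// linearly between two stored keyframes. Illustrative only -- not the
// actual xith-tk MD2Loader implementation.
public class KeyframeLerp {
    // frameA and frameB are flat arrays [x0,y0,z0, x1,y1,z1, ...];
    // t in [0,1] is the position between the two keyframes.
    public static float[] interpolate(float[] frameA, float[] frameB, float t) {
        float[] out = new float[frameA.length];
        for (int i = 0; i < frameA.length; i++) {
            out[i] = frameA[i] + (frameB[i] - frameA[i]) * t;
        }
        return out;
    }

    public static void main(String[] args) {
        float[] a = {0f, 0f, 0f};
        float[] b = {2f, 4f, 6f};
        float[] mid = interpolate(a, b, 0.5f);
        System.out.printf("%.1f %.1f %.1f%n", mid[0], mid[1], mid[2]); // prints 1.0 2.0 3.0
    }
}
```

This also shows the memory trade-off mentioned above: doing this at runtime only stores the keyframes themselves, whereas load-time interpolation precomputes and keeps many intermediate frames.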

Thank you !

I’ll try the static loaders this week (I’m currently finishing my level-editor-to-VRML exporter).

I’ll try animations later (maybe next week).

Lilian