Vertex buffers and particle engines.

So, I’ve got a particle engine: http://www.funkapotamus.org/shipfountain.gif - That’s a crazy gun that ship has!

And it’s pretty nifty. I get ~7500 particles @ 60fps. I recycle objects and use a custom ArrayList to store particles (objects are stored in an array of particles rather than an array of objects — this eliminates the need to cast objects to particles when they are removed from the list).

However, I don’t use vertex buffers, nor do I have any understanding of how to use them. I’ve got many tutorials and examples, but none of them show me how to use vertex buffers for multiple quads with thousands of vertices. Even so, there are things I’m still missing about how vertex buffers are used.

The way I understand them, you set up your vertices once. Then, whenever you want to draw them, you merely pass the buffer to OpenGL. I understand this concept if it were used with, say, a model. A model’s vertices are relative to one another. If you wanted to move the model, you’d just translate to the position you wanted before you drew the buffer. The point here is that the buffer storing the model’s verts never gets changed.
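That idea in rough code, using plain java.nio (the LWJGL draw calls are sketched as comments since they need a GL context, and all names here are made up for illustration): the quad’s vertices are written into a direct FloatBuffer once, and each frame you only translate and point GL at the same unchanged buffer.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class StaticQuadBuffer {
    // Built once at load time: a quad centred on the origin (model space).
    public static FloatBuffer buildQuad(float halfSize) {
        FloatBuffer buf = ByteBuffer.allocateDirect(4 * 3 * 4) // 4 verts * 3 floats * 4 bytes
                                    .order(ByteOrder.nativeOrder())
                                    .asFloatBuffer();
        buf.put(new float[] {
            -halfSize, -halfSize, 0.0f,   // lower left
             halfSize, -halfSize, 0.0f,   // lower right
             halfSize,  halfSize, 0.0f,   // upper right
            -halfSize,  halfSize, 0.0f,   // upper left
        });
        buf.flip(); // ready for reading; never modified again
        return buf;
    }

    // Each frame you only move and re-draw the same buffer:
    // GL11.glTranslatef(worldX, worldY, 0.0f);
    // GL11.glVertexPointer(3, 0, quadBuffer);
    // GL11.glDrawArrays(GL11.GL_QUADS, 0, 4);
}
```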

Now, if I were to use vertex buffers in my particle engine, how would I go about doing that? I can only think of two ways:

  1. Set up the necessary buffers for one quad. Since the particles are all the same size and use the same texture, we can reuse the buffer. Then, whenever I want to draw a particle, I’d translate to its world position and draw the buffer. The problem with this is that I could conceivably be doing thousands of glTranslatef calls. I do not want to do this — I’ve already limited my translate calls down to one per particle system per frame. It netted me a 20% speed boost that I would like to keep.

  2. Set up a gigantic vertex buffer that holds every vert of every particle in the system. Here’s pseudocode:


float floatBuffer[] = new float[activeParticles * 12];
int intBuffer[] = new int[activeParticles * 4];
FloatBuffer vertexBuffer;

int pointer = 0;
int intPointer = 0;
for (int i = 0; i < activeParticles; i++)
{
   Particle p = particles[i];

   // Particle's world position (assuming particle coordinates
   // are stored relative to the system's origin).
   float xOffset = pSystemX + p.xPos;
   float yOffset = pSystemY + p.yPos;

   // Lower left corner
   floatBuffer[pointer]     = xOffset;
   floatBuffer[pointer + 1] = yOffset;
   floatBuffer[pointer + 2] = 0.0f; // 2D

   // Lower right corner
   floatBuffer[pointer + 3] = xOffset + textureWidth;
   floatBuffer[pointer + 4] = yOffset;
   floatBuffer[pointer + 5] = 0.0f;

   // Upper left corner
   floatBuffer[pointer + 6] = xOffset;
   floatBuffer[pointer + 7] = yOffset + textureHeight;
   floatBuffer[pointer + 8] = 0.0f;

   // Upper right corner
   floatBuffer[pointer + 9]  = xOffset + textureWidth;
   floatBuffer[pointer + 10] = yOffset + textureHeight;
   floatBuffer[pointer + 11] = 0.0f;

   // Indices count whole vertices, not float offsets, and are wound
   // LL, LR, UR, UL so GL_QUADS doesn't draw a bowtie.
   intBuffer[intPointer]     = i * 4;
   intBuffer[intPointer + 1] = i * 4 + 1;
   intBuffer[intPointer + 2] = i * 4 + 3;
   intBuffer[intPointer + 3] = i * 4 + 2;

   pointer += 12;
   intPointer += 4;
}
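One detail worth calling out in the pseudocode above: the indices handed to glDrawElements count whole vertices, not positions in the float array, and GL_QUADS needs the corners visited around the quad’s edge or you get a bowtie. A self-contained sketch of just the index generation, assuming the corner order from the vertex loop (lower left, lower right, upper left, upper right):

```java
public class QuadIndices {
    // Index buffer for n identical quads. Each index names a whole
    // vertex (a group of three floats), so quad i owns vertices
    // i*4 .. i*4+3. Winding is LL, LR, UR, UL so GL_QUADS gets a
    // convex quad instead of a bowtie.
    public static int[] build(int quadCount) {
        int[] indices = new int[quadCount * 4];
        for (int i = 0; i < quadCount; i++) {
            int v = i * 4;               // first vertex of this quad
            indices[i * 4]     = v;      // lower left
            indices[i * 4 + 1] = v + 1;  // lower right
            indices[i * 4 + 2] = v + 3;  // upper right
            indices[i * 4 + 3] = v + 2;  // upper left
        }
        return indices;
    }
}
```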

Assuming that GL is set up properly and the buffer is allocated the way it should be, is this a decent method? Is this how vertex buffers are used on a large scale? I guess what I’m asking for is a more practical use of a vertex buffer. I don’t quite understand how they’re used on anything past a tutorial on how to draw a cube. Which leads me to my next question:

How exactly would you set up a vertex buffer/texture buffer/index buffer for anything more complex than a simple tutorial quad? I can’t get anything to work outside the tutorials themselves. Perhaps I’m misunderstanding things altogether?

Big thanks to everyone,

-Funk

Method ‘2’ is the one you’ll want to go for. You’re pretty much stuck with regenerating the buffers every frame because that’s just how particles behave. You could potentially use point sprites (which means only a single vertex per particle), but last time I looked at them they had all sorts of nasty limitations.

You should look at which buffers actually need updating as well. Obviously positions will need updating every frame, as will vertex colours, probably. Texture coords can probably stay static (unless you’ve got some kind of animation), and indices can definitely be reused (potentially between different particle systems as well).
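For instance, the texture coordinates for n identical particle quads can be generated once at startup and reused every frame. A sketch, assuming each quad maps the full texture and the corner order matches the vertex buffer (lower left, lower right, upper left, upper right):

```java
public class StaticTexCoords {
    // One (u, v) pair per vertex, four vertices per quad, in the same
    // corner order as the vertex buffer: LL, LR, UL, UR. Built once,
    // reused every frame since the mapping never changes.
    public static float[] build(int quadCount) {
        float[] tex = new float[quadCount * 8];
        float[] quad = {
            0.0f, 0.0f,   // lower left
            1.0f, 0.0f,   // lower right
            0.0f, 1.0f,   // upper left
            1.0f, 1.0f,   // upper right
        };
        for (int i = 0; i < quadCount; i++) {
            System.arraycopy(quad, 0, tex, i * 8, 8);
        }
        return tex;
    }
}
```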

Not to change the subject

But has anyone used the GL_ARB_point_sprite extension in LWJGL?

[quote]Not to change the subject

But has anyone used the GL_ARB_point_sprite extension in LWJGL?
[/quote]
That extension is currently borked on anything but Nvidia cards; don’t bother using it.

On topic:
I’ve been experimenting a lot with particles in the past couple of days and I managed to create some very realistic flames. The only drawback is, I converted my LWJGL engine to use C++ instead of Java :P. However, that shouldn’t stop any of you from converting the particles portion of the code back to Java.

Pics
http://www.realityflux.com/abba/C++/Dynamic%20Reflection/cubemapreflection.jpg

Exe
http://www.realityflux.com/abba/C++/Dynamic%20Reflection/Reflection%20CM.zip

source :smiley:
http://www.realityflux.com/abba/C++/Dynamic%20Reflection/GLParticles.cpp
http://www.realityflux.com/abba/C++/Dynamic%20Reflection/GLParticles.h

Vertex Program for particles billboard


char *partsBillboardCode = { "!!ARBvp1.0\n"
                             "ADDRESS coeffIndices;\n"     
                             "PARAM   modelViewProjection[4] = {     state.matrix.mvp      };\n"
                             "PARAM   verticesOffsets[4]     = { {-1.0,-1.0 }, { 1.0,-1.0 },\n"
                             "                                   { 1.0, 1.0 }, {-1.0, 1.0 }};\n"
                             "PARAM   modelView[4]           = {   state.matrix.modelview  };\n"
                             "PARAM   particlesSize          = program.env[0];\n"
                             "TEMP    newLocation, newUpVector;\n"

                             "ARL     coeffIndices.x, vertex.texcoord[0].z;\n"

                             "MUL     newUpVector  , modelView[1]  , verticesOffsets[coeffIndices.x].y;\n"
                             "MAD     newLocation  , modelView[0]  , verticesOffsets[coeffIndices.x].x, newUpVector;\n"
                             "MAD     newLocation  , particlesSize , newLocation, vertex.position;\n"
                             "MOV     newLocation.w, 1.0;\n"

                             "DP4     result.position.x, modelViewProjection[0], newLocation;\n"
                             "DP4     result.position.y, modelViewProjection[1], newLocation;\n"
                             "DP4     result.position.z, modelViewProjection[2], newLocation;\n"
                             "DP4     result.position.w, modelViewProjection[3], newLocation;\n"

                             "MOV     result.color, vertex.color;\n"
                             "MOV     result.texcoord[0], vertex.texcoord[0];\n"

                             "END\n"};
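For anyone who’d rather do the billboard expansion on the CPU, the maths the program above implements is just corner = centre + size * (right * ox + up * oy), where right and up are the camera’s axes taken from the modelview matrix and (ox, oy) is one of the four corner offsets. A plain-Java sketch of that formula (all names here are illustrative):

```java
public class BillboardCorner {
    // Expand a particle centre into one quad corner in world space:
    //   corner = centre + size * (right * ox + up * oy)
    // 'right' and 'up' are the camera's right/up axes and (ox, oy) is
    // one of the offsets {-1,-1}, {1,-1}, {1,1}, {-1,1}, so the quad
    // always faces the camera.
    public static float[] corner(float[] centre, float[] right, float[] up,
                                 float ox, float oy, float size) {
        float[] out = new float[3];
        for (int k = 0; k < 3; k++) {
            out[k] = centre[k] + size * (right[k] * ox + up[k] * oy);
        }
        return out;
    }
}
```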

Do vertex buffers even work in ortho mode? I’ve tried all I could and I can’t quite seem to get things going. I ported a working test into ortho mode and things broke.

InitGL:


      GL11.glMatrixMode(GL11.GL_PROJECTION);
      GL11.glLoadIdentity();
      GL11.glOrtho(0, Display.getDisplayMode().getWidth(), Display.getDisplayMode().getHeight(), 0, -1000, 1000);
      // Switch back, or the per-frame glLoadIdentity() will wipe the ortho matrix.
      GL11.glMatrixMode(GL11.GL_MODELVIEW);

Called Each Frame:


      GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
      GL11.glLoadIdentity();

      // Transformations
      GL11.glTranslatef(200.0f, 200.0f, 0.0f);
      //GL11.glRotatef(rot, 1.0f, 0.75f, 0.30f);

      // Send float buffers to LWJGL.
      GL11.glVertexPointer(3, 0, bufferVertex);
      GL11.glColorPointer(3, 0, bufferColor);

      // Draw elements :)
      GL11.glDrawElements(GL11.GL_QUADS, bufferIndice);

      // Increase rotation angle.
      rot += 0.4f;

Buffer Setup:

      public final float cube_vertices[] =
      {
            -2.0f,  2.0f, 0.0f,      // Front Face Top Left.
             2.0f,  2.0f, 0.0f,      // Front Face Top Right.
             2.0f, -2.0f, 0.0f,      // Front Face Bottom Right.
            -2.0f, -2.0f, 0.0f,      // Front Face Bottom Left.

            -2.0f,  2.0f, 0.0f,      // Back Face Top Left.
             2.0f,  2.0f, 0.0f,      // Back Face Top Right.
             2.0f, -2.0f, 0.0f,      // Back Face Bottom Right.
            -2.0f, -2.0f, 0.0f,      // Back Face Bottom Left.
      };

      // Color array.
      public final float cube_colors[] =
      {
            1.0f, 0.0f, 0.0f,
            0.0f, 1.0f, 0.0f,
            0.0f, 0.0f, 1.0f,
            1.0f, 1.0f, 0.0f,

            1.0f, 0.0f, 1.0f,
            0.0f, 1.0f, 1.0f,
            0.0f, 0.0f, 0.0f,
            1.0f, 1.0f, 1.0f,
      };

      // Cube indexes to tell OpenGL how to draw vertices.
      public final int cube_indexes[] =
      {
            0, 3, 2, 1,      // Front Face
            5, 6, 7, 4,      // Back Face
            4, 0, 1, 5,      // Top Face
            3, 7, 6, 2,      // Bottom Face
            4, 0, 3, 7,      // Left Face
            1, 2, 6, 5,      // Right Face
      };

      public FloatBuffer bufferVertex = BufferUtils.createFloatBuffer(24);
      public FloatBuffer bufferColor  = BufferUtils.createFloatBuffer(24);
      public IntBuffer   bufferIndice = BufferUtils.createIntBuffer(24);
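One thing I’d double-check: buffers from BufferUtils start out empty, and if the arrays are never put() into them and flip()ed before glVertexPointer/glDrawElements run, GL reads no data at all — which would look exactly like “things broke”. A sketch of the fill step using plain java.nio (BufferUtils.createFloatBuffer does essentially this allocation):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferFill {
    // Copy a float array into a direct buffer and flip it so its
    // position is 0 and its limit marks the end of the data. GL reads
    // from position to limit, so an unflipped buffer looks empty.
    public static FloatBuffer fill(float[] data) {
        FloatBuffer buf = ByteBuffer.allocateDirect(data.length * 4)
                                    .order(ByteOrder.nativeOrder())
                                    .asFloatBuffer();
        buf.put(data);
        buf.flip();
        return buf;
    }
}
```

e.g. `bufferVertex = BufferFill.fill(cube_vertices);` and likewise for the colours; the index array needs the same put-and-flip treatment through an IntBuffer.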

A little side question: is there a way to take an ortho X/Y coordinate and translate it into a perspective-mode coordinate? My game is 2D, but my particle engine runs in perspective mode so I can make some niftier effects. I want to position particle systems relative to objects drawn on the screen in ortho mode, but I don’t know how to make them line up correctly.

@side question

You can set up your perspective in a way that gives you an area with 1:1 pixels at z = 0. That way you can do your 2D stuff (z always 0) and also throw some 3D stuff into the scene without changing anything.
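The field of view that gives 1:1 pixels follows from simple trigonometry: with the camera pulled back a distance d from the z = 0 plane, half the screen height has to subtend half the vertical FOV, so fovy = 2 * atan((height/2) / d). A sketch of that computation (the gluPerspective call is commented out because it needs a GL context; the numbers in it are just an example):

```java
public class PixelPerfectPerspective {
    // Vertical field of view (in degrees) that makes one world unit at
    // z = 0 equal one pixel, for a camera 'distance' units back from
    // the z = 0 plane and a viewport 'screenHeight' pixels tall.
    public static double fovyDegrees(double screenHeight, double distance) {
        return Math.toDegrees(2.0 * Math.atan((screenHeight / 2.0) / distance));
    }

    // Usage with LWJGL (needs a GL context), e.g. a 800x600 viewport
    // with the camera 600 units back:
    // GLU.gluPerspective((float) fovyDegrees(600, 600), 800 / 600.0f, 1.0f, 2000.0f);
    // GL11.glTranslatef(-400.0f, -300.0f, -600.0f); // put (0,0) at a screen corner
}
```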