So, I’ve got a particle engine: http://www.funkapotamus.org/shipfountain.gif - That’s a crazy gun that ship has!
And it’s pretty nifty. I get ~7500 particles @ 60fps. I recycle objects and use a custom ArrayList to store particles (objects are stored in an array of Particles rather than an array of Objects; this eliminates the need to cast Objects to Particles when they are removed from the list).
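For anyone curious, the typed list I mean is roughly this (a sketch, not my exact code; the swap-remove trick is just one way to make removal cheap, since particle draw order doesn’t matter):

```java
public class ParticleList {
    // stand-in for my real particle class
    static class Particle { float xPos, yPos; }

    // Backing array is typed to Particle, so remove() hands back a
    // Particle directly - no cast from Object like java.util.ArrayList.
    private final Particle[] items;
    private int size;

    public ParticleList(int capacity) { items = new Particle[capacity]; }

    public void add(Particle p) { items[size++] = p; }

    // Swap-remove: O(1) - move the last element into the hole.
    public Particle remove(int i) {
        Particle removed = items[i];
        items[i] = items[--size];
        items[size] = null;
        return removed; // already a Particle, no cast needed
    }

    public int size() { return size; }

    public static void main(String[] args) {
        ParticleList list = new ParticleList(8);
        Particle a = new Particle();
        Particle b = new Particle();
        list.add(a);
        list.add(b);
        Particle r = list.remove(0); // no cast: r is already a Particle
        System.out.println((r == a) + " " + list.size());
    }
}
```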
However, I don’t use vertex buffers, nor do I have any understanding of how to use them. I’ve read many tutorials and examples, but none of them show me how to use vertex buffers for multiple quads with thousands of vertices. And even then, there are gaps in my understanding of how vertex buffers are used.
The way I understand them, you’re supposed to set up your vertices once. Then, whenever you want to draw them, you merely pass the buffer to OpenGL. I understand this concept if it were used with, say, a model. A model’s vertices are relative to one another, so if you wanted to move the model, you’d just translate to the position you wanted before you drew the buffer. The point here is that the buffer storing the model’s verts never gets changed.
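In code, the "build once, reuse every frame" idea I have in mind looks something like this (a sketch only; the GL calls are just indicated in comments, LWJGL GL11-style, and the names are mine):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class StaticModelBuffer {
    // Built once at load time. GL wants a direct buffer in native byte order.
    public static FloatBuffer buildVertexBuffer(float[] verts) {
        FloatBuffer fb = ByteBuffer.allocateDirect(verts.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(verts);
        fb.flip(); // rewind so GL reads from position 0
        return fb;
    }

    public static void main(String[] args) {
        // one triangle, set up once
        float[] verts = { 0f, 0f, 0f,   1f, 0f, 0f,   0f, 1f, 0f };
        FloatBuffer vb = buildVertexBuffer(verts);
        // every frame (GL calls sketched as comments):
        //   glTranslatef(modelX, modelY, 0f);   // move the whole model
        //   glVertexPointer(3, 0, vb);          // point GL at the buffer
        //   glDrawArrays(GL_TRIANGLES, 0, verts.length / 3);
        System.out.println(vb.capacity() + " " + vb.get(3));
    }
}
```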
Now, if I were to use vertex buffers in my particle engine, how would I go about doing that? I can only think of two ways:
-
Set up the necessary buffers for one quad. Since the particles are all the same size and use the same texture, we can reuse the buffer. Then, whenever I want to draw a particle, I’d translate to its world position and draw the buffer. The problem with this is that I could conceivably be doing thousands of glTranslatef calls. I don’t want to do that; I’ve already cut my translate calls down to one per particle system per frame, which netted me a 20% speed boost that I’d like to keep.
-
Set up a gigantic vertex buffer that holds every vert of every particle in the system. Here’s pseudocode:
float floatBuffer[] = new float[activeParticles * 12];
int intBuffer[] = new int[activeParticles * 4];
FloatBuffer vertexBuffer;
int pointer = 0;
int intPointer = 0;
for(int i = 0; i < activeParticles; i++)
{
    Particle p = particles[i];
    float xOffset = pSystemX + p.xPos; // particle's world position
    float yOffset = pSystemY + p.yPos;
    //Lower Left Corner
    floatBuffer[pointer] = xOffset;
    floatBuffer[pointer + 1] = yOffset;
    floatBuffer[pointer + 2] = 0.0f; // 2D
    //Lower Right Corner
    floatBuffer[pointer + 3] = xOffset + textureWidth;
    floatBuffer[pointer + 4] = yOffset;
    floatBuffer[pointer + 5] = 0.0f;
    //Upper Left Corner
    floatBuffer[pointer + 6] = xOffset;
    floatBuffer[pointer + 7] = yOffset + textureHeight;
    floatBuffer[pointer + 8] = 0.0f;
    //Upper Right Corner
    floatBuffer[pointer + 9] = xOffset + textureWidth;
    floatBuffer[pointer + 10] = yOffset + textureHeight;
    floatBuffer[pointer + 11] = 0.0f;
    //Indices count vertices (not floats), and walk around the quad:
    //lower left, lower right, upper right, upper left
    int firstVertex = i * 4;
    intBuffer[intPointer] = firstVertex;
    intBuffer[intPointer + 1] = firstVertex + 1;
    intBuffer[intPointer + 2] = firstVertex + 3;
    intBuffer[intPointer + 3] = firstVertex + 2;
    pointer += 12;
    intPointer += 4;
}
Assuming that GL is set up properly and the buffer is allocated the way it should be, is this a decent method? Is this how vertex buffers are used on a large scale? I guess what I’m asking for is a more practical use of a vertex buffer. I don’t quite understand how they’re used in anything past a tutorial on how to draw a cube. Which leads me to my next question:
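To make the pseudocode above concrete, here’s a self-contained sketch of the fill loop plus the part it glosses over: wrapping the arrays into nio buffers to hand to GL. The GL calls themselves are only sketched in comments, and all the names here are mine:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;

public class ParticleQuadBuffers {
    // Fill 12 floats (4 corners * xyz) per particle and 4 vertex indices per quad.
    public static void fillQuad(float[] verts, int[] indices, int i,
                                float x, float y, float w, float h) {
        int p = i * 12;
        verts[p]     = x;     verts[p + 1]  = y;     verts[p + 2]  = 0f; // lower left
        verts[p + 3] = x + w; verts[p + 4]  = y;     verts[p + 5]  = 0f; // lower right
        verts[p + 6] = x;     verts[p + 7]  = y + h; verts[p + 8]  = 0f; // upper left
        verts[p + 9] = x + w; verts[p + 10] = y + h; verts[p + 11] = 0f; // upper right
        int v = i * 4; // this quad's first *vertex* index
        indices[i * 4]     = v;     // lower left
        indices[i * 4 + 1] = v + 1; // lower right
        indices[i * 4 + 2] = v + 3; // upper right
        indices[i * 4 + 3] = v + 2; // upper left (walks around the quad)
    }

    public static void main(String[] args) {
        float[] verts = new float[2 * 12];
        int[] indices = new int[2 * 4];
        fillQuad(verts, indices, 0, 10f, 20f, 4f, 4f);
        fillQuad(verts, indices, 1, 50f, 60f, 4f, 4f);
        // Wrap into direct buffers for GL (refilled each frame):
        FloatBuffer vb = ByteBuffer.allocateDirect(verts.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        vb.put(verts).flip();
        IntBuffer ib = ByteBuffer.allocateDirect(indices.length * 4)
                .order(ByteOrder.nativeOrder()).asIntBuffer();
        ib.put(indices).flip();
        // One call then draws every particle (GL sketched as comments):
        //   glVertexPointer(3, 0, vb);
        //   glDrawElements(GL_QUADS, ib);
        System.out.println(indices[4] + " " + indices[6] + " " + verts[12]);
    }
}
```

The nice part is that the whole system costs one draw call per frame instead of one per particle.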
How exactly would you set up a vertex buffer/texture buffer/index buffer for anything more complex than a tutorial’s single quad? I can’t get anything to work outside the tutorials themselves. Perhaps I’m misunderstanding things altogether?
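One thing I think I do see for the texture buffer side of it: since every particle uses the same texture, the texture coordinates never change, so that buffer could be built once up front with the same four (u,v) pairs repeated per quad, in the same corner order as the vertices. A sketch (names mine, GL call only in a comment):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class SharedTexCoords {
    // Same 4 (u,v) pairs for every quad, matching the vertex order:
    // lower left, lower right, upper left, upper right.
    public static FloatBuffer build(int maxParticles) {
        float[] uv = { 0f, 0f,   1f, 0f,   0f, 1f,   1f, 1f };
        FloatBuffer fb = ByteBuffer.allocateDirect(maxParticles * 8 * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        for (int i = 0; i < maxParticles; i++) {
            fb.put(uv);
        }
        fb.flip();
        return fb;
    }

    public static void main(String[] args) {
        FloatBuffer tc = build(3);
        // Set once at startup and never touched again:
        //   glTexCoordPointer(2, 0, tc);
        System.out.println(tc.capacity() + " " + tc.get(8) + " " + tc.get(10));
    }
}
```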
Big thanks to everyone,
-Funk
