I am having a major problem with the libGDX 3D API.
Here is what my game currently looks like.
I am rendering 500 objects, each with a mesh of:
4 vertices
6 indices
4 normals (Vector3)
4 texture coordinates (Vector2)
Optimizations I have made so far:
Frustum culling
Batch rendering (using ModelBatch)
Results I get:
Android (Samsung Galaxy Ace 2): around 24 out of 60 FPS
Desktop (AMD FX 8350 CPU with an XFX Radeon 7870 graphics card): around 120 out of 120 FPS
You’re going to have much better luck if you can boil your problem down to a more specific question and a smaller piece of code, preferably an MCVE (minimal, complete, verifiable example).
What profiling have you done? What kind of performance are you expecting? What kind of performance are you getting instead? Not just your FPS, but in specific parts of the code. Is a specific line taking longer than you expect it to?
It’s hard to debug your entire project and offer broad statements. It’s much easier to look at specific standalone pieces of example code and comment on those instead.
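To make the profiling suggestion concrete, here is a minimal plain-Java sketch (no libGDX dependency; `SectionTimer` is a made-up helper name, not a library class) for timing a suspect section of code with `System.nanoTime()` before reaching for a full profiler:

```java
// Minimal manual timing sketch: wrap a suspect section in nanoTime()
// measurements to see where the frame budget actually goes.
public class SectionTimer {
    private long start;
    private long elapsedNanos;

    public void begin() { start = System.nanoTime(); }

    public void end() { elapsedNanos = System.nanoTime() - start; }

    public double elapsedMillis() { return elapsedNanos / 1_000_000.0; }

    public static void main(String[] args) {
        SectionTimer timer = new SectionTimer();
        timer.begin();
        long sum = 0;                      // stand-in for e.g. the render or picking code
        for (int i = 0; i < 1_000_000; i++) sum += i;
        timer.end();
        System.out.println("section took " + timer.elapsedMillis() + " ms (sum=" + sum + ")");
    }
}
```

Wrapping just the render call, and then just the update logic, quickly shows which side of the loop is eating the frame time.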
[quote]What profiling have you done? What kind of performance are you expecting? What kind of performance are you getting instead? Not just your FPS, but in specific parts of the code. Is a specific line taking longer than you expect it to?
[/quote]
If you do some profiling you’ll narrow this down a bunch. Not saying that this is happening to you, but I thought my performance problems were solely based on rendering, but part of it (a big part) turned out to be my ray-triangle picking.
Go download VisualVM and run it. It’ll be worth the effort.
The Ace 2 is a low-end device from 2012 with an 800 MHz dual-core CPU. 24 FPS sounds pretty reasonable for such a thing. How many objects are you actually drawing after the visibility check?
Hmm, that is interesting, because the Ace 2 can run GTA Vice City quite well, yet it cannot handle 500 objects, of which only about 86 are visible at once after culling, and which are not really 3D?
The first optimization is to sort the meshes based on the material (texture) they use, so as to minimize the state changes when binding them. Next, why not push the objects into the same batch?
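The sorting idea can be sketched in plain Java. `DrawCall` below is a hypothetical stand-in for a renderable that exposes the GL handle of its diffuse texture; in libGDX you would read that from the Renderable’s TextureAttribute instead, and the `Shader.compareTo` hook is where such ordering normally plugs in:

```java
import java.util.Arrays;
import java.util.Comparator;

// Hypothetical stand-in for a renderable carrying its diffuse texture handle.
class DrawCall {
    final int textureHandle;
    DrawCall(int textureHandle) { this.textureHandle = textureHandle; }
}

public class MaterialSort {
    // Sorting draw calls by texture groups identical bindings together,
    // so the renderer switches textures once per group instead of per object.
    static void sortByTexture(DrawCall[] calls) {
        Arrays.sort(calls, Comparator.comparingInt((DrawCall c) -> c.textureHandle));
    }

    public static void main(String[] args) {
        DrawCall[] calls = { new DrawCall(3), new DrawCall(1), new DrawCall(3), new DrawCall(1) };
        sortByTexture(calls);
        for (DrawCall c : calls) System.out.print(c.textureHandle + " "); // handles now grouped: 1 1 3 3
    }
}
```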
In addition to the other suggestions here (create a minimal example, profile, sort to minimize state changes, etc.), could you post the shaders you’re using? I didn’t see them in the materials you linked to (I may have missed them though).
If it’s not the rendering itself, it might be something else in your code. The Dalvik VM reacts strangely to some things that are perfectly fine in desktop Java, which is why using VisualVM might not really help here. Dalvik likes micro-optimizations a lot.
For my shaders I have been using the default shader from libGDX. I was using my own shader, but then libGDX wouldn’t let me optimize my mesh rendering, as you can see from my previous posts.
Perhaps you could recap why you weren’t able to use your own shader? I don’t know what LibGDX’s default shader does, but it seems possible that it’s doing more work than you actually need, which could be part of the problem. It seems using your own shader (provided you can get it working) would be beneficial in that you’d know exactly what’s going on at the shader level. (Or maybe someone else knows what the LibGDX default shader does and can provide some info on that.)
As long as you don’t measure the actual FPS of the GTA game, you are comparing ‘a feeling of smoothness’ with an actual number, which can be misleading. Personally, I would say that 30-60 FPS is smooth as well. Do you have any log output from the VM? Maybe garbage collection is a problem?
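One cheap way to check the garbage-collection suspicion is to log used heap every so often; a sawtooth that repeatedly climbs and drops points at per-frame allocation churn triggering the GC. This plain-Java sketch uses only `Runtime` (libGDX also exposes `Gdx.app.getJavaHeap()` for the same number):

```java
// Log used heap periodically; watch for a rapidly repeating rise-and-drop
// pattern, which indicates heavy per-frame allocation.
public class HeapLogger {
    static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        System.out.println("used heap: " + usedHeapBytes() / 1024 + " KiB");
    }
}
```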
Exception in thread "LWJGL Application" org.lwjgl.opengl.OpenGLException: Cannot use offsets when Element Array Buffer Object is disabled
at org.lwjgl.opengl.GLChecks.ensureElementVBOenabled(GLChecks.java:89)
at org.lwjgl.opengl.GL11.glDrawElements(GL11.java:1117)
at com.badlogic.gdx.backends.lwjgl.LwjglGL20.glDrawElements(LwjglGL20.java:843)
at com.badlogic.gdx.graphics.Mesh.render(Mesh.java:570)
at com.badlogic.gdx.graphics.Mesh.render(Mesh.java:523)
at com.hawk.zomb.engine.shader.DefaultShader.render(DefaultShader.java:76)
at com.badlogic.gdx.graphics.g3d.ModelBatch.flush(ModelBatch.java:213)
at com.badlogic.gdx.graphics.g3d.ModelBatch.end(ModelBatch.java:224)
at com.hawk.zomb.engine.renderEngine.EntityRenderer.render(EntityRenderer.java:75)
at com.hawk.zomb.engine.renderEngine.MasterRenderer.render(MasterRenderer.java:31)
at com.hawk.zomb.game.MainGame.updateRender(MainGame.java:102)
at com.hawk.zomb.config.screen.GameState.render(GameState.java:10)
at com.badlogic.gdx.Game.render(Game.java:46)
at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:214)
at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:120)
Shader code:
package com.hawk.zomb.engine.shader;

import com.badlogic.gdx.graphics.Camera;
import com.badlogic.gdx.graphics.g3d.Renderable;
import com.badlogic.gdx.graphics.g3d.Shader;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.attributes.TextureAttribute;
import com.badlogic.gdx.graphics.g3d.utils.RenderContext;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;
import com.hawk.zomb.config.GameWindow;
import com.hawk.zomb.engine.utils.Utils;

public class DefaultShader implements Shader {

    private ShaderProgram shader;

    @Override
    public void dispose() {
        shader.dispose();
    }

    @Override
    public void init() {
        shader = GameWindow.getDefaultShader();
    }

    @Override
    public int compareTo(Shader other) {
        // Returning 0 means ModelBatch never reorders renderables by shader.
        return 0;
    }

    @Override
    public boolean canRender(Renderable instance) {
        return true;
    }

    @Override
    public void begin(Camera camera, RenderContext context) {
        shader.begin();
        shader.setUniformMatrix(Utils.COMBINED_UNIFORM, camera.combined);
    }

    @Override
    public void render(Renderable renderable) {
        shader.setUniformi(Utils.TEXTURE0_UNIFORM, 0);
        shader.setUniformMatrix(Utils.TRANSFORM_UNIFORM, renderable.worldTransform);

        TextureAttribute t = (TextureAttribute) renderable.material.get(TextureAttribute.Diffuse);
        if (t != null) {
            // Bind the attribute's actual texture to unit 0. The original
            // glBindTexture(0, t.uvIndex) was wrong on two counts: the first
            // argument must be a target such as GL_TEXTURE_2D, not 0, and
            // uvIndex is a UV set index, not a GL texture handle.
            t.textureDescription.texture.bind(0);
            shader.setUniformi(Utils.ENABLETEXTURE_UNIFORM, 1); // use texture
        } else {
            shader.setUniformi(Utils.ENABLETEXTURE_UNIFORM, 0); // don't use texture
        }

        // Plain null checks instead of catching NullPointerException, and a
        // consistent four-component colour uniform in both branches.
        ColorAttribute c = (ColorAttribute) renderable.material.get(ColorAttribute.Diffuse);
        if (c != null && c.color != null) {
            shader.setUniformf(Utils.COLOUR_UNIFORM, c.color.r, c.color.g, c.color.b, c.color.a);
        } else {
            shader.setUniformf(Utils.COLOUR_UNIFORM, 1, 1, 1, 1);
        }

        renderable.mesh.render(shader, renderable.primitiveType, renderable.meshPartOffset, renderable.meshPartSize);
    }

    @Override
    public void end() {
        shader.end();
    }
}
You can see garbage collection activity in the log output. If you connect your device to your PC, you should be able to see it in your IDE (in Eclipse, it’s in the DDMS view).
Also, transparency is not working as it should; I am using ModelBatch.
It worked perfectly when using my own rendering engine (however, I was rendering the meshes one by one).
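On the transparency issue: ModelBatch only sorts and blends a renderable as transparent when its material carries a BlendingAttribute. A minimal sketch, assuming your instances expose their materials in the usual ModelInstance way:

```java
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.g3d.Material;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.BlendingAttribute;

// Flag a material as transparent so ModelBatch's default sorter draws it
// back-to-front, after the opaque geometry.
public class TransparencySetup {
    static void makeTransparent(ModelInstance instance) {
        Material material = instance.materials.first();
        material.set(new BlendingAttribute(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA));
    }
}
```

That may explain why it looked right in your own engine: rendering meshes one by one can happen to produce a usable draw order, while ModelBatch reorders draws unless the material tells it the object is blended.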