Hi there!
I’m trying to incorporate a post-processing shader into my first project. So I set up a FrameBuffer, draw the scene into it, and then draw its color texture to the screen.
But when I render directly on screen, I get this:
https://thumb.ibb.co/h4i1Jb/cool.jpg
When I render using FrameBuffer, I get this:
https://thumb.ibb.co/gdgz4G/shit.jpg
Notice how different the edges of the bricks look. As far as I understand, in the second case there is no texture filtering? Or is something trickier causing this?
I’m using LibGDX.
Textures are loaded like this:
brickTexture = new Texture(Gdx.files.internal(BRICK));
FrameBuffer setup:
frameBuffer = FrameBuffer.createFrameBuffer(
        Pixmap.Format.RGBA8888,
        Gdx.graphics.getWidth(),
        Gdx.graphics.getHeight(),
        false);
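If missing filtering really is the cause, I guess I would need to set a linear filter on the FrameBuffer’s color texture right after creating it? Something like this (just a guess from reading the Texture docs, I don’t know if Linear is already the default for FBO textures):

```java
// Guess: force linear min/mag filtering on the FBO's color texture,
// so it doesn't get sampled with nearest-neighbor when drawn scaled.
Texture fboTexture = frameBuffer.getColorBufferTexture();
fboTexture.setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);
```

Would that be the right place to do it, or does the filter need to be re-applied somewhere else?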
And my render() method:
@Override
public void render(float delta) {
    camera.update();
    if (paused) {
        menu.act(delta);
    } else {
        game.act(delta);
    }

    // first pass: render the scene into the FrameBuffer
    frameBuffer.begin();
    Gdx.gl.glClearColor(0, 0, 0, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    if (paused) {
        menu.draw();
    } else {
        batch.begin();
        game.draw(batch);
        batch.end();
    }
    frameBuffer.end();

    // second pass: draw the FBO texture to the screen,
    // flipped vertically via the negative height (FBO textures come out upside down)
    viewport.apply();
    Gdx.gl.glClearColor(0, 0, 0, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.begin();
    batch.draw(frameBuffer.getColorBufferTexture(), 0, WORLD_HEIGHT, WORLD_WIDTH, -WORLD_HEIGHT);
    batch.end();
}