LibGDX tile rendering flicker.

What’s up everybody? :smiley:

Ok, so I am making a top-down 2(.5)D game. My “floor” tiles are stored in a W x H array (not the tiles themselves, but their ids). When I want to render them, I use two nested for loops (y and x), get the texture for each tile id and render it at x * Tile.TILE_SIZE (an int constant). To optimize the algorithm, I calculate a start and end x and y based on the camera’s position (a float converted to an integer) and only render the tiles that are on screen.
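Roughly what my render loop looks like (field names simplified; tileIds / tileTextures are just placeholders for my real structures):

import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureRegion;

public class TileRenderer {
	// Placeholder fields standing in for my actual level / atlas structures.
	private int[][] tileIds;              // [y][x] array of tile ids
	private TextureRegion[] tileTextures; // texture looked up by tile id

	public void render(SpriteBatch batch, OrthographicCamera camera) {
		// Visible range of tile indices, derived from the camera position
		// (this is the float -> int conversion I mentioned above).
		int startX = Math.max(0, (int) ((camera.position.x - camera.viewportWidth / 2) / Tile.TILE_SIZE));
		int endX   = Math.min(tileIds[0].length - 1, (int) ((camera.position.x + camera.viewportWidth / 2) / Tile.TILE_SIZE));
		int startY = Math.max(0, (int) ((camera.position.y - camera.viewportHeight / 2) / Tile.TILE_SIZE));
		int endY   = Math.min(tileIds.length - 1, (int) ((camera.position.y + camera.viewportHeight / 2) / Tile.TILE_SIZE));

		for (int y = startY; y <= endY; y++) {
			for (int x = startX; x <= endX; x++) {
				batch.draw(tileTextures[tileIds[y][x]], x * Tile.TILE_SIZE, y * Tile.TILE_SIZE);
			}
		}
	}
}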

Problem:
The problem is that when I move the camera (using a calculated delta, a float with a long fractional part) I get a visual bug: a 1 pixel gap between random rows of tiles (only horizontal gaps). My assumption is that the problem is caused by that float -> integer conversion.

Could anyone tell me how I can fix this problem? Also, how can I avoid this kind of problem in the future?
Thanks!

Edit: Here’s a picture of what is happening.

When I was working with TileMaps in libGDX, that problem only occurred when I zoomed the camera in; plain rendering (no matter what tile size I specified) was smooth. I never got around to fixing it because I started working on something else.

Disable bilinear filtering.
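In libGDX that means setting the Nearest filter (GL_NEAREST) on the texture(s); the file names below are just examples:

Texture tilesheet = new Texture(Gdx.files.internal("tiles.png"));
tilesheet.setFilter(Texture.TextureFilter.Nearest, Texture.TextureFilter.Nearest);

// Or, if the tiles come from a TextureAtlas, set it on every backing texture:
TextureAtlas atlas = new TextureAtlas(Gdx.files.internal("tiles.atlas"));
for (Texture t : atlas.getTextures()) {
	t.setFilter(Texture.TextureFilter.Nearest, Texture.TextureFilter.Nearest);
}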

I don’t even think I am using bilinear filtering. Isn’t the default in libGDX to use GL_NEAREST when loading TextureAtlases?

I found that casting the camera’s coordinates to int fixes it, but this messes up my camera code (I am calculating a vector towards the player’s position for a smooth camera effect). Any fix for this? :smiley:

Here’s my code:


private void updateCameraPosition() {
	float targetX, targetY;
	
	// Clamp the follow target so the camera never shows anything outside the level bounds on x.
	if(player.getPosition().x - camera.viewportWidth / 2 < 0) targetX = camera.viewportWidth / 2;
	else if(player.getPosition().x + camera.viewportWidth / 2 > world.getLevel().getWidth() * Tile.TILE_SIZE) 
		targetX = world.getLevel().getWidth() * Tile.TILE_SIZE - camera.viewportWidth / 2;
	else targetX = player.getPosition().x;
	
	// Same clamping on y, with an 8 px offset at the bottom.
	if(player.getPosition().y - camera.viewportHeight / 2 < 8) targetY = camera.viewportHeight / 2 + 8;
	else if(player.getPosition().y + camera.viewportHeight / 2 > world.getLevel().getHeight() * Tile.TILE_SIZE + 8) 
		targetY = world.getLevel().getHeight() * Tile.TILE_SIZE - camera.viewportHeight / 2 + 8;
	else targetY = player.getPosition().y;
	
	// Move the camera a fraction of the remaining distance each frame for the smooth-follow effect.
	float dx = targetX - camera.position.x, dy = targetY - camera.position.y, dist = (float) Math.hypot(dx, dy);
	Vector3 cameraVector = new Vector3((float) Math.cos(Math.atan2(dy, dx)), (float) Math.sin(Math.atan2(dy, dx)), 0);
	cameraVector.mul(Math.max(dist / 25f, .2f));
	
	camera.position.add(cameraVector);
	camera.update();
}
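
One idea I’m considering: keep the float position for the smooth-follow math above and only round it right before drawing the tiles, then restore it afterwards. Untested sketch:

private void renderWithSnappedCamera() {
	// Remember the real (smooth) position.
	float realX = camera.position.x, realY = camera.position.y;

	// Snap to whole pixels only for drawing, so tile edges stay aligned.
	camera.position.x = Math.round(realX);
	camera.position.y = Math.round(realY);
	camera.update();

	// ... draw the tile layer here with the snapped camera ...

	// Restore the unrounded position so updateCameraPosition() keeps its smooth motion.
	camera.position.set(realX, realY, camera.position.z);
	camera.update();
}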