Okay, so I'm making a 2D pixel-based game. It uses authentic pixelation; that is, I don't try to artificially make things look pixelated by drawing "blocky" art. Instead, I draw everything to a small image (320 × 180) and scale it up by the largest power of 2 that still fits inside the user's screen (to preserve square pixels). The problem I'm having is with mapping the positions of objects to the 2D graphics. I use Graphics.drawImage() to draw the graphics to the screen like so:
x, y, width, and height are all floats:
g.drawImage(image, (int) (x - width / 2), (int) (y - height / 2), (int) width, (int) height, null);
x and y are the positions of the object, with a camera offset subtracted from them. The camera offset is also stored as a pair of floats. My question is: is this the best way to map floating-point numbers to integers for rendering? I frequently get cases where objects seem to be out of sync with each other. For example, when the camera moves past two stationary objects, they move across the screen but "jitter" slightly relative to each other. I don't know if the problem is in how I'm truncating the floats to integers or something else.
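For context, here's roughly how the whole pipeline fits together in my code (names like lowRes, cameraX, cameraY, and scale are simplified for this post, and the real version is spread across a few classes):

import java.awt.Graphics;
import java.awt.image.BufferedImage;

class Renderer {
    // Everything gets drawn into this small off-screen buffer first.
    BufferedImage lowRes = new BufferedImage(320, 180, BufferedImage.TYPE_INT_RGB);

    // Camera position, also stored as floats.
    float cameraX, cameraY;

    // Each object subtracts the camera offset, then the result is cast to ints for drawImage().
    void drawObject(Graphics g, BufferedImage image, float x, float y, float width, float height) {
        float screenX = x - cameraX;
        float screenY = y - cameraY;
        g.drawImage(image, (int) (screenX - width / 2), (int) (screenY - height / 2),
                (int) width, (int) height, null);
    }

    // Then the whole buffer is scaled up by an integer factor so pixels stay square.
    void present(Graphics screen, int scale) {
        screen.drawImage(lowRes, 0, 0, 320 * scale, 180 * scale, null);
    }
}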
Any advice?