Is anybody using this system of rendering too?!

Where you only render what the screen sees, e.g.:

# = world

" = screen
& = entity

```
##########################################

                                          &  <-- don't render this

"""""""""""""""""""""""""
&&

"""""""""""""""""""""""""

##########################################
```
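In other words, something roughly like this (a minimal sketch in Python; `Rect`, `bounds`, and the other names are just for illustration, not any engine's real API):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge, world coordinates
    y: float  # top edge
    w: float  # width
    h: float  # height

def intersects(a: Rect, b: Rect) -> bool:
    """Axis-aligned rectangle overlap test."""
    return (a.x < b.x + b.w and a.x + a.w > b.x and
            a.y < b.y + b.h and a.y + a.h > b.y)

def draw_visible(entities, camera: Rect, draw):
    """Call `draw` only for entities whose bounding box overlaps the camera."""
    for e in entities:
        if intersects(e.bounds, camera):  # `bounds` is an assumed Rect attribute
            draw(e)
```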

I’m pretty sure yes…

Yup. It’s called “frustum culling”. It’s even done in 3D games.
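In 3D the same idea tests bounding volumes against the six planes of the view frustum. A rough sketch (plane extraction from the projection matrix omitted; the planes are assumed to be given with inward-facing unit normals):

```python
def sphere_in_frustum(center, radius, planes) -> bool:
    """Conservative sphere-vs-frustum test.

    `planes` is six (nx, ny, nz, d) tuples with unit normals pointing
    into the frustum, so points inside get a positive signed distance.
    """
    cx, cy, cz = center
    for nx, ny, nz, d in planes:
        # Signed distance from the sphere's center to this plane.
        if nx * cx + ny * cy + nz * cz + d < -radius:
            return False  # entirely behind one plane -> cull it
    return True  # inside or straddling -> keep it (conservative)
```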

I’m pretty sure a majority of people would, to save memory. :smiley:

I’m sure a ton of people are. I use that system and it’s critical to my performance. I render everything up to 100 px off the screen (but I scaled everything by 3x because it’s a retro game, so that’s 300 real pixels off-screen).

If you want to make a game with a TON of textures/art assets all loaded at once (like mine), it’s basically required, I think. Personally, I’d say give yourself some buffer room like I did, so if any glitches happen the player doesn’t see blackness off to the sides.
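The buffer is just an inflated culling rectangle, something like this (a sketch; only the 100 px margin and 3x scale come from the post above, the camera size is made up):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

def inflate(r: Rect, margin: float) -> Rect:
    """Grow the culling rectangle outward on all sides by `margin` pixels."""
    return Rect(r.x - margin, r.y - margin, r.w + 2 * margin, r.h + 2 * margin)

SCALE = 3  # retro upscale factor, as above
cull_rect = inflate(Rect(0, 0, 640, 360), 100 * SCALE)  # 300 real px of slack
```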

Yes.

I ran stress tests on my game, loading in 1.1 million tiles to see how it would run.

1 FPS, less if that’s even possible.

Implemented culling, solid 60.

This. :wink:

Even on my current 128x128 test map: without culling, about 6 FPS; with it, a solid 60.
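For a uniform tile grid you don’t even need a per-tile overlap test; you can compute the visible index range straight from the camera, which is why a million-tile map goes from unplayable to trivial. A rough sketch (all names assumed; `draw_tile` is a stand-in for your engine’s draw call):

```python
def visible_tile_range(cam_x, cam_y, cam_w, cam_h, tile_size, map_w, map_h):
    """Inclusive start / exclusive end of the on-screen columns and rows."""
    x0 = max(0, int(cam_x // tile_size))
    y0 = max(0, int(cam_y // tile_size))
    x1 = min(map_w, int((cam_x + cam_w) // tile_size) + 1)
    y1 = min(map_h, int((cam_y + cam_h) // tile_size) + 1)
    return x0, x1, y0, y1

def draw_tiles(tiles, cam, tile_size):
    """Draw only the tiles inside the camera; `tiles` is indexed [row][col]."""
    x0, x1, y0, y1 = visible_tile_range(cam.x, cam.y, cam.w, cam.h,
                                        tile_size, len(tiles[0]), len(tiles))
    for row in range(y0, y1):
        for col in range(x0, x1):
            draw_tile(tiles[row][col], col * tile_size, row * tile_size)
```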

I am currently using chunks, so there are entities drawn outside the screen if their chunk is partly inside :slight_smile:

I even go a bit beyond this. I use chunks, and I only render the parts of a chunk that are within the bounds of the screen. :wink:
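Roughly like this, I imagine (a sketch with assumed chunk/tile sizes; it also assumes the camera stays inside the map): first cull whole chunks against the screen rectangle, then clip the tile range inside each surviving chunk.

```python
CHUNK = 16  # tiles per chunk side (assumed)
TILE = 32   # pixels per tile (assumed)

def visible_chunks(cam_x, cam_y, cam_w, cam_h):
    """Yield (cx, cy) indices of chunks overlapping the camera rectangle."""
    size = CHUNK * TILE
    for cy in range(int(cam_y // size), int((cam_y + cam_h) // size) + 1):
        for cx in range(int(cam_x // size), int((cam_x + cam_w) // size) + 1):
            yield cx, cy

def visible_tiles_in_chunk(cx, cy, cam_x, cam_y, cam_w, cam_h):
    """Clip the camera to this chunk and yield only the on-screen tile coords."""
    ox, oy = cx * CHUNK, cy * CHUNK          # chunk origin, in tile coordinates
    x0 = max(ox, int(cam_x // TILE))
    y0 = max(oy, int(cam_y // TILE))
    x1 = min(ox + CHUNK, int((cam_x + cam_w) // TILE) + 1)
    y1 = min(oy + CHUNK, int((cam_y + cam_h) // TILE) + 1)
    for ty in range(y0, y1):
        for tx in range(x0, x1):
            yield tx, ty
```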