Large Terrain Optimization

Hi!

I am working on my project where I render a very, very large terrain: 40 000 square kilometres, or about 15 000 square miles. That’s the size of Southwestern Ontario and then a bit, or about 1/6 the size of the United Kingdom.

Since this is so massively huge, I obviously couldn’t just render the whole thing into one display list. Building it would take absolutely forever, and I’d have to rebuild it every time the terrain changed, which could be often, so that is out of the question.

I’ve decided to take the same approach as Minecraft and many other games and divide the terrain into manageable chunks. I then build one of these every frame or so and store it in a display list. That works really, really well, up to a point. I was trying to make it so that I could see the entire terrain at once, in a sort of overview mode. I sat and watched the chunks load and draw, and it was fantastic. I still have some work to do on the lighting and the normals, but it looks good.
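Roughly, the per-chunk display-list building I have in mind looks something like this sketch (the TerrainChunk name, the heights array and the 64-cell chunk size are just placeholders for illustration, using plain LWJGL GL11 calls):

```java
import org.lwjgl.opengl.GL11;

// Builds one chunk's geometry into a display list.
// CHUNK_SIZE and the heights array are placeholders for however the
// heightmap is actually stored; heights must be (CHUNK_SIZE+1)^2.
public class TerrainChunk {
    public static final int CHUNK_SIZE = 64;

    private final int originX, originZ;
    private int displayList = 0;

    public TerrainChunk(int originX, int originZ) {
        this.originX = originX;
        this.originZ = originZ;
    }

    /** Call this for at most one or two chunks per frame. */
    public void build(float[][] heights) {
        if (displayList == 0) {
            displayList = GL11.glGenLists(1);
        }
        GL11.glNewList(displayList, GL11.GL_COMPILE);
        GL11.glBegin(GL11.GL_TRIANGLES);
        for (int x = 0; x < CHUNK_SIZE; x++) {
            for (int z = 0; z < CHUNK_SIZE; z++) {
                // Two triangles per heightmap cell.
                emit(x,     z,     heights);
                emit(x + 1, z,     heights);
                emit(x,     z + 1, heights);

                emit(x + 1, z,     heights);
                emit(x + 1, z + 1, heights);
                emit(x,     z + 1, heights);
            }
        }
        GL11.glEnd();
        GL11.glEndList();
    }

    private void emit(int x, int z, float[][] heights) {
        GL11.glVertex3f(originX + x, heights[x][z], originZ + z);
    }

    /** Cheap to call every frame once the list is built. */
    public void render() {
        if (displayList != 0) {
            GL11.glCallList(displayList);
        }
    }
}
```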

And then this happened.

That was just the beginning. CPU usage spiked to 99%, and it actually made my sound stop working and caused my Wacom tablet driver to freeze and stop responding. I decided that that wouldn’t make a very good point in a feature list.

And after some fiddling, that brings me here, asking how any of you have managed to wrangle large terrains. I know that I could just do some trigonometry to work out how many chunks are visible at the current view level and so on, but I want to see if there are any additional things I could do. I want the rendering and graphics to be extremely fast, because the game logic is most likely going to be very CPU intensive.

So if you have any ideas or solutions for me, just let me know!

-Jacob

Use a quadtree, and make the data multiresolution.

Hi! Thanks for the quick response!

I really don’t know what a quadtree is (in terms of 3D graphics) or how I would go about doing anything with one. And having the data be multiresolution probably won’t work, but thank you for your suggestions; I will most definitely think about how I would go about implementing them!

Thanks
-Jacob

A web search with the keywords “multiresolution” and “terrain” should give a ton of hits. The first thing that jumps to my mind is to do something really simple like using Haar wavelets to store progressive resolutions of the terrain data. That way your active memory footprint remains manageable regardless of the viewing distance.
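As a deliberately simplified sketch, a single 1D Haar step splits the data into pairwise averages (the coarser level) and details (the refinement); the class and method names here are made up for illustration:

```java
// One level of a 1D Haar decomposition: the first half of the output
// holds pairwise averages (the coarser resolution), the second half
// holds the details needed to reconstruct the original exactly.
public final class Haar {

    public static float[] decomposeStep(float[] data) {
        int half = data.length / 2;
        float[] out = new float[data.length];
        for (int i = 0; i < half; i++) {
            float a = data[2 * i];
            float b = data[2 * i + 1];
            out[i] = (a + b) / 2f;        // average -> coarse level
            out[half + i] = (a - b) / 2f; // detail  -> refinement
        }
        return out;
    }

    public static float[] reconstructStep(float[] coarseAndDetail) {
        int half = coarseAndDetail.length / 2;
        float[] out = new float[coarseAndDetail.length];
        for (int i = 0; i < half; i++) {
            float avg = coarseAndDetail[i];
            float det = coarseAndDetail[half + i];
            out[2 * i] = avg + det;
            out[2 * i + 1] = avg - det;
        }
        return out;
    }
}
```

For a 2D heightmap you would run the same step along rows and then columns, and repeat on the coarse quarter to get each successively lower resolution.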

If your computer froze, didn’t you just run out of RAM so Windows started swapping to your hard drive? Even the sound would stop while your HD is busy. If that’s the problem, maybe you just need more RAM (if you only have 2GB, I mean)? ;D
You might also have run out of video RAM. You could use GPU-Z to see how much VRAM is in use if you have an NVidia card. BTW, I think both NVidia and ATI have OpenGL extensions to check how much VRAM is available…
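For example, on NVidia cards something like this sketch should get you the number (0x9049 is GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX from the GL_NVX_gpu_memory_info extension; ATI exposes a different extension, GL_ATI_meminfo, which works differently):

```java
import org.lwjgl.opengl.GL11;

public final class VramInfo {
    // Token from the GL_NVX_gpu_memory_info extension.
    private static final int GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX = 0x9049;

    /** Returns the currently available video memory in KB, or -1 if the
     *  NVidia extension is not present. Must be called with a current
     *  OpenGL context. */
    public static int availableVramKb() {
        String extensions = GL11.glGetString(GL11.GL_EXTENSIONS);
        if (extensions == null || !extensions.contains("GL_NVX_gpu_memory_info")) {
            return -1;
        }
        return GL11.glGetInteger(GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX);
    }
}
```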

The problem is that I want the game to be able to run on lower-end computers. But mine is most definitely not a lower-end computer :D. I have 4GB of RAM, and the pagefile was not being used. My GPU has 1GB of RAM, and I do not think that was the problem, as I wasn’t loading much into it. I guess I will just have to limit the number of chunks drawn at once and do some view frustum culling (something like the sketch below). Thanks for the help!
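A rough sketch of the kind of per-chunk test I mean; the plane extraction from the projection and modelview matrices is assumed to happen elsewhere (e.g. the Gribb/Hartmann method), and the class names are just for illustration:

```java
// A frustum plane in the form ax + by + cz + d = 0, with (a, b, c) normalized.
class Plane {
    float a, b, c, d;

    float distanceTo(float x, float y, float z) {
        return a * x + b * y + c * z + d;
    }
}

class FrustumCuller {
    private final Plane[] planes;

    /** The six planes are assumed to be extracted each frame from the
     *  projection * modelview matrix (not shown here). */
    FrustumCuller(Plane[] sixPlanes) {
        this.planes = sixPlanes;
    }

    /** True if a chunk's bounding sphere is at least partly inside the frustum. */
    boolean isChunkVisible(float centerX, float centerY, float centerZ, float radius) {
        for (Plane p : planes) {
            if (p.distanceTo(centerX, centerY, centerZ) < -radius) {
                return false; // completely behind this plane
            }
        }
        return true;
    }
}
```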

Level of detail is the way to do this. Googling for the quadtree algorithm should turn up lots of examples. You might have to generate the levels of detail on the fly, or compute the levels once and save them so they can be read from disk as needed, but anything with large amounts of terrain just won’t fit completely into memory with room left over for everything else (even if you have 4GB of RAM).
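To make that concrete, here is a rough sketch of the quadtree selection step; the names and the distance threshold are made up for illustration, and each node’s mesh could be a display list or VBO built at that node’s resolution:

```java
import java.util.List;

// A quadtree node covering a square patch of terrain. Leaf nodes hold
// the full-resolution mesh; inner nodes hold a coarser version of the
// same area.
class QuadTreeNode {
    final float centerX, centerZ;  // centre of this patch in world space
    final float halfSize;          // half the patch's edge length
    final QuadTreeNode[] children; // null for leaves
    final int meshId;              // display list / VBO id for this LOD

    QuadTreeNode(float centerX, float centerZ, float halfSize,
                 QuadTreeNode[] children, int meshId) {
        this.centerX = centerX;
        this.centerZ = centerZ;
        this.halfSize = halfSize;
        this.children = children;
        this.meshId = meshId;
    }

    /** Collect the meshes to draw: refine while the camera is closer
     *  than some multiple of the patch size, otherwise use this LOD. */
    void select(float camX, float camZ, List<Integer> toDraw) {
        float dx = camX - centerX;
        float dz = camZ - centerZ;
        float dist = (float) Math.sqrt(dx * dx + dz * dz);

        boolean refine = children != null && dist < halfSize * 4f;
        if (refine) {
            for (QuadTreeNode child : children) {
                child.select(camX, camZ, toDraw);
            }
        } else {
            toDraw.add(meshId);
        }
    }
}
```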

Are you leaking memory? :clue:

I don’t think I am leaking memory; I just think that I need to optimize a bit and make a few concessions.