Writing and loading large data sets

Hi, I want to stream data from a medium (normally the hard disk), but the data set is pretty large, over 200 MB at least. So there has to be an 'intelligent' partitioning system. I thought of partitioning my data into square tiles, determining what needs to be loaded given the camera position and frustum, and then loading the tiles that fall inside the frustum. A sketch of that idea follows below.
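For what it's worth, here is a minimal sketch of the tile selection, assuming a flat grid of square tiles and a simple radius test around the camera as a stand-in for a full frustum test (kTileSize and all names are made up):

    // Minimal sketch: pick the tiles around the camera on a flat grid of
    // square tiles. The radius test stands in for a real test of each
    // tile's bounding box against the six frustum planes.
    #include <cmath>
    #include <vector>

    const float kTileSize = 256.0f; // world units per tile (assumption)

    struct TileCoord { int x, y; };

    std::vector<TileCoord> visibleTiles(float camX, float camY, float radius)
    {
        std::vector<TileCoord> tiles;
        int minX = (int)std::floor((camX - radius) / kTileSize);
        int maxX = (int)std::floor((camX + radius) / kTileSize);
        int minY = (int)std::floor((camY - radius) / kTileSize);
        int maxY = (int)std::floor((camY + radius) / kTileSize);
        for (int y = minY; y <= maxY; ++y)
            for (int x = minX; x <= maxX; ++x)
                tiles.push_back({x, y});
        return tiles;
    }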

I think it involves creating a binary data file that holds the offsets (x, y coordinates?) of the tiles, then loading them in a separate thread and handing them to the renderer? Something along the lines of the sketch below, maybe.
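A sketch of such an index file, whose entries record where each tile's data lives in one big binary blob; the layout, the count-first header, and all names are assumptions:

    // Sketch of a binary tile index: a uint64 entry count followed by
    // packed TileEntry records. A loader thread would then seek to
    // entry.offset in the data file, read entry.size bytes, and hand
    // the buffer to the renderer through a locked queue.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct TileEntry {
        int32_t  x, y;    // tile grid coordinates
        uint64_t offset;  // byte offset of the tile's data in the blob
        uint64_t size;    // size of the tile's data in bytes
    };

    std::vector<TileEntry> readIndex(const char* path)
    {
        std::vector<TileEntry> index;
        std::FILE* f = std::fopen(path, "rb");
        if (!f) return index;
        uint64_t count = 0;
        if (std::fread(&count, sizeof(count), 1, f) == 1) {
            index.resize(count);
            std::fread(index.data(), sizeof(TileEntry), count, f);
        }
        std::fclose(f);
        return index;
    }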

Any help/links? :)

cheers,

Paul

Even though this topic is a little outdated, I will give a hint. You did not say what kind of data you are talking about, but I guess it is some kind of terrain rendering problem. See this document, which might help you understand how terrain is rendered efficiently, including cool features like detail enhancement and out-of-core rendering.

Rendering Very Large, Very Detailed Terrain

Hope this helps (if not you, then maybe others who take a look at this topic).

Ragosch

It is used for rendering point clouds, on the order of 100 million points, but pretty sustainable frame rates can be achieved with memory-mapped buffers in combination with glDrawElements.
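For anyone who finds this later, a rough sketch of what that can look like on a POSIX system: mmap the packed xyz floats so the OS pages them in on demand, point a classic vertex array at the mapping, and draw the visible subset with glDrawElements. The file layout and names are assumptions:

    // Rough sketch: map a file of packed xyz floats and draw a visible
    // subset of the points with classic vertex arrays. POSIX-only.
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>
    #include <cstddef>
    #include <GL/gl.h>

    struct MappedPoints {
        const float* xyz = nullptr; // mapped vertex data, 3 floats per point
        size_t count = 0;           // number of points in the file
    };

    // The OS pages the data in on demand, so even a multi-gigabyte cloud
    // can be drawn without reading it all up front.
    MappedPoints mapPoints(const char* path)
    {
        MappedPoints mp;
        int fd = open(path, O_RDONLY);
        if (fd < 0) return mp;
        struct stat st;
        fstat(fd, &st);
        void* mem = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        close(fd); // the mapping stays valid after the descriptor closes
        if (mem == MAP_FAILED) return mp;
        mp.xyz = static_cast<const float*>(mem);
        mp.count = st.st_size / (3 * sizeof(float));
        return mp;
    }

    // 'indices' selects the points of the currently visible tiles.
    void drawPoints(const MappedPoints& mp, const GLuint* indices, GLsizei n)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, mp.xyz);
        glDrawElements(GL_POINTS, n, GL_UNSIGNED_INT, indices);
        glDisableClientState(GL_VERTEX_ARRAY);
    }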

Paul