LWJGL/OpenGL Spherical Terrain Generation

I’m trying to do spherical terrain generation in real time, with multiple LODs. Have any of you tried this? Any recommendations? Useful resources? Also, yes, it’s intended to be used as a planet; shaders for atmospheric effects, solar flares, etc. will all be done too. If you can help me anywhere, thanks!

Well, I want to do the same, but I’m focusing on the terrain first.
I found some code to draw a sphere with quads, so I guess I’m going from there. Maybe it can help you too:


    public static void drawSphere(float basex, float basey, float r, int lats, int longs) {
        // SimpleMath.sin/cos are assumed to wrap the standard trig functions.
        final float pi = (float) Math.PI;
        final float pi2 = 2f * pi;
        float xc = 1f / longs;   // texture step per longitude segment
        float yc = 1f / lats;    // texture step per latitude band

        for (int i = 1; i <= lats; i++) {
            // Latitude of the previous ring and of the current ring.
            float lat0 = pi * (-0.5f + (float) (i - 1) / lats);
            float z0  = SimpleMath.sin(lat0);
            float zr0 = SimpleMath.cos(lat0);

            float lat1 = pi * (-0.5f + (float) i / lats);
            float z1  = SimpleMath.sin(lat1);
            float zr1 = SimpleMath.cos(lat1);

            GL11.glBegin(GL11.GL_QUAD_STRIP);
            for (int j = 0; j <= longs; j++) {
                float lng = pi2 * (float) j / longs;
                float x = SimpleMath.cos(lng);
                float y = SimpleMath.sin(lng);

                // Normals are the unit direction from the sphere's centre;
                // only the positions get the radius and the (x, y) offset.
                GL11.glTexCoord2f(xc * j, yc * (i - 1));
                GL11.glNormal3f(x * zr0, y * zr0, z0);
                GL11.glVertex3f(basex + x * zr0 * r, basey + y * zr0 * r, z0 * r);

                GL11.glTexCoord2f(xc * j, yc * i);
                GL11.glNormal3f(x * zr1, y * zr1, z1);
                GL11.glVertex3f(basex + x * zr1 * r, basey + y * zr1 * r, z1 * r);
            }
            GL11.glEnd();
        }
    }

I guess the only thing left to do is determine the size of a quad and create a chunk with multiple vertices instead of a single quad.
The chunk also needs to follow a small curve, so it maps smoothly onto the sphere.
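Something like the following is what I have in mind: a rough sketch (all names and parameters here are just illustrative) that builds one curved chunk covering a small latitude/longitude range, so every vertex sits exactly on the sphere:

    // Rough sketch only: builds the vertex positions of one chunk spanning
    // [lat0, lat1] x [lng0, lng1] on a sphere of radius r, with `res` quads per side.
    public static float[] buildChunk(float lat0, float lat1, float lng0, float lng1,
                                     float r, int res) {
        float[] verts = new float[(res + 1) * (res + 1) * 3];
        int k = 0;
        for (int i = 0; i <= res; i++) {
            float lat = lat0 + (lat1 - lat0) * i / res;
            for (int j = 0; j <= res; j++) {
                float lng = lng0 + (lng1 - lng0) * j / res;
                // Every vertex lies on the sphere, so the chunk follows the curve.
                verts[k++] = r * (float) (Math.cos(lat) * Math.cos(lng));
                verts[k++] = r * (float) (Math.cos(lat) * Math.sin(lng));
                verts[k++] = r * (float) Math.sin(lat);
            }
        }
        return verts;
    }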

Hmm, the poles are still a problem. Have you seen any reasonable fixes for this?

I guess I’m just going to use smaller chunks as they get nearer to the poles, like with textures wrapped around a sphere.
With the code I posted I’m going to check the size of a quad and use those dimensions to determine the chunk size.
The only problems left would be the rotation in all directions, and determining which chunks to generate/draw.

I’ve got a concept in mind, but I’m pretty busy these next few days and may not be able to get anything significant done. I’ll try to keep this thread updated, maybe it’ll help you as well.

Try sampling a 3D noise function on the surface of the sphere in the same way you’d sample a 2D noise function on a plane to create “flat” terrain.
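For example (a minimal sketch; noise3 here is just a placeholder for whatever 3D noise implementation you use, e.g. simplex or Perlin noise):

    // Sketch: displace a point on the unit sphere by 3D noise sampled at that point.
    public static float[] terrainVertex(float[] dir, float radius,
                                        float frequency, float amplitude) {
        float h = noise3(dir[0] * frequency, dir[1] * frequency, dir[2] * frequency);
        float len = radius * (1f + amplitude * h);  // push the vertex in/out along its normal
        return new float[] { dir[0] * len, dir[1] * len, dir[2] * len };
    }

    // Placeholder: plug in your own 3D noise function here (simplex, Perlin, ...).
    static float noise3(float x, float y, float z) {
        return 0f; // stub
    }

Because every point on the sphere gets a single, continuous 3D noise value, there are no seams and no special cases at the poles.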

To make the problem with the poles smaller, you can do the following:

I’ve managed to generate a dodecahedron (12-sided, using pentagons), but I’m trying to calculate the center position (centroid) of each face. I’m slowly melting my face trying to find a proper formula for this. Any ideas how I can do it?
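
In case it helps, for the flat regular pentagons of a dodecahedron the centroid of a face is simply the average of its vertices. A quick sketch, assuming each vertex is stored as a float[3]:

    // Centroid of a face = average of its vertex positions
    // (exact for the regular pentagonal faces of a dodecahedron).
    public static float[] faceCentroid(float[][] faceVerts) {
        float[] c = new float[3];
        for (float[] v : faceVerts) {
            c[0] += v[0]; c[1] += v[1]; c[2] += v[2];
        }
        c[0] /= faceVerts.length;
        c[1] /= faceVerts.length;
        c[2] /= faceVerts.length;
        return c;
    }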

search: “sphere tessellation algorithm”.

Don’t solve problems that have already been solved 100 billion times.

I completely blanked on that word, tessellation, thanks for the tip!

It’s probably worth noting that if you’re using shaders then you can side-step the problem of tessellation since a sphere is a simple implicit surface.

Implicit surfaces don’t play nice with shaders… or am I missing something?

Classically, if you want to render a sphere you tessellate the whole thing. With basic shaders you could simply have the geometry of half a sphere and have it always face the camera, so pretty much only the edges need to be highly tessellated and very little toward the center. Assuming the sphere isn’t going to intersect with anything else, I can’t think of any big reason why you couldn’t just render a quad (well, really a pair of tris) sprite… but I don’t recall what the impact of discard is on current-ish hardware. (It was a killer for hierarchical depth buffers at one point.)

I’ve never played around with “real” implicit surfaces on the GPU, but I get the impression that demoscene kids are doing it.

Sure, you could do it, as long as you don’t do anything else or if they are really small. It’s just like ray tracing… not the best match for the hardware/pipeline. Geometry shaders could of course tessellate for you, which doesn’t require anything to be implicit.
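For what it’s worth, the basic discard trick for a camera-facing quad boils down to a fragment shader like this (illustrative old-style GLSL kept as a Java string; coord is assumed to be the quad’s interpolated coordinates in [-1, 1]):

    // Sketch of a fake-sphere impostor: the quad always faces the camera, and
    // fragments outside the unit circle are discarded. Names are illustrative only.
    private static final String SPHERE_IMPOSTOR_FS =
          "varying vec2 coord;                        // quad coords in [-1, 1]\n"
        + "void main() {\n"
        + "    float d2 = dot(coord, coord);\n"
        + "    if (d2 > 1.0) discard;                 // outside the sphere's silhouette\n"
        + "    vec3 n = vec3(coord, sqrt(1.0 - d2));  // reconstructed sphere normal\n"
        + "    float light = max(dot(n, normalize(vec3(0.3, 0.7, 0.6))), 0.0);\n"
        + "    gl_FragColor = vec4(vec3(light), 1.0);\n"
        + "}\n";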

It depends on how you plan to build and interact with the planet.

  • If you only see the planet from a distance, you can use 3D noise to create a sprite of the planet from any angle.
  • If the density of the vertices doesn’t significantly matter, or if you need “grid” coordinates, you can distort a cube to a sphere and get the planet data from a texture. (You can make planets without visual artifacts, but if you need the grid, it will still be slightly warped.)
  • If the planets are manually designed, just create multiple LOD levels.
  • If you need relatively uniformly distributed vertices, you can start with an icosahedron and subdivide it the same way geodesic domes are created.

Performance-wise, the cube-based method is the most flexible. LOD algorithms relevant to height maps can be applied to it, too, since it’s essentially just six height maps side by side.
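The mapping itself is simple. A sketch of just the position part (normalize a point on a unit cube face and scale it to the sphere; the face/(u, v) names are only illustrative):

    // Sketch: map a point on a unit cube face onto the sphere by normalizing it.
    // (u, v) are face coordinates in [-1, 1]; `face` picks one of the 6 cube faces.
    public static float[] cubeToSphere(int face, float u, float v, float radius) {
        float x, y, z;
        switch (face) {
            case 0:  x =  1; y =  u; z =  v; break;   // +X
            case 1:  x = -1; y =  u; z =  v; break;   // -X
            case 2:  x =  u; y =  1; z =  v; break;   // +Y
            case 3:  x =  u; y = -1; z =  v; break;   // -Y
            case 4:  x =  u; y =  v; z =  1; break;   // +Z
            default: x =  u; y =  v; z = -1; break;   // -Z
        }
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[] { x / len * radius, y / len * radius, z / len * radius };
    }

Each face keeps a regular (u, v) grid, which is why the usual height-map LOD schemes still apply; the slight warping mentioned above comes from the normalization step.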

[quote="delt0r,post:14,topic:40263"]
Creating spheres with simple quads (with shader discard) is easy and should perform quite well. I read a paper about a PS3(?) billiards game some time ago which did exactly that.
Their point was that with this technique a lot more nice effects were possible (because of the ray tracing); also, AA on spheres is a problem which can be solved much better by doing it like this.
[/quote]

Not that I’m suggesting you do this, but it seems like the work in the fragment shader is about the same whether you’re using real geometry or a flat impostor. Using a quad will result in something like a 22% discard rate (the sphere’s silhouette covers π/4 of the quad, so roughly 1 - π/4 ≈ 21% of the fragments get discarded)… but increasing the number of sides (even by a small amount) will radically drop this.

This would of course be a nice optimization for impostors.