Depth testing isn't working

Also,

@Override
public void init() throws Throwable {
	super.init();
	GL11.glClearColor(0.2f, 0.2f, 0.2f, 1);
	GL11.glClearDepth(1d);
	rootObject.init();
	
	GL11.glEnable(GL11.GL_DEPTH_TEST);
	
	ambient = new Program();
	ambient.attachShader((ShaderAsset)getLoader().getAsset("program-world-ambient-fs"));
	ambient.attachShader((ShaderAsset)getLoader().getAsset("program-world-ambient-vs"));
	ambient.link();
}

@Override
public void render() throws Throwable {
	super.render();
	GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT | GL11.GL_STENCIL_BUFFER_BIT);
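	// ambient pass: render the scene once, writing color and depth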
	ambient.use();
	
	rootObject.render(ambient);
	
	// light passes: blend each light's contribution additively on top of the ambient result
	GL11.glEnable(GL11.GL_BLEND);
	GL11.glBlendFunc(GL11.GL_ONE, GL11.GL_ONE);
	GL11.glDepthMask(false);
	GL11.glDepthFunc(GL11.GL_EQUAL);
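	// with GL_EQUAL, only fragments at exactly the depth laid down by the ambient pass get lit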
	
	for(int i = 0; i < lights.size(); i++){
		lights.get(i).use();
		rootObject.render(lights.get(i).getProgram());
	}
	
	GL11.glDepthFunc(GL11.GL_LEQUAL);
	GL11.glDepthMask(true);
	GL11.glDisable(GL11.GL_BLEND);
}

Produces some weird problems, like this:

it’s related to [icode]glDepthMask[/icode] (writing to the depth buffer) and draw order.

looks like it’s not writing depth but still testing, which is valid, but probably not what you wanted :slight_smile:
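for reference, the usual state bracket around a multi-pass light loop looks roughly like this (just a minimal sketch of the idea, not your exact code):

GL11.glDepthMask(true);            // ambient pass writes depth normally
GL11.glDepthFunc(GL11.GL_LEQUAL);
// ... render the ambient pass ...

GL11.glDepthMask(false);           // light passes re-use that depth but don't write it
GL11.glDepthFunc(GL11.GL_EQUAL);   // only fragments matching the ambient depth get lit
// ... render the light passes additively ...

GL11.glDepthMask(true);            // restore it: glClear honors the depth mask, so it must be true before clearing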

I’ve heard about draw orders before… Somewhere.

How would I go about fixing this? This is the first time I’ve had this problem, and I don’t know how I got it :open_mouth:

One step forward, two steps back. :wink:

depends on what image you want to make :wink:

what if you enabled depth test and depth write … and disabled backface culling [icode]GL11.glDisable(GL11.GL_CULL_FACE);[/icode]? the sphere should not be visible then when the camera is below the plane.

edit

draw order would not matter anymore then. it would just affect performance, not the output image.

it’s only important if you do depth-testing with scenes which contain objects that are not writing to the depth buffer … or the other way around.

No effect… The ball still looks transparent, like before.

sounds like a shader/blending mismatch.

try

GL14.glBlendEquation(GL14.GL_FUNC_ADD);
GL11.glBlendFunc(GL11.GL_SRC_ALPHA,GL11.GL_ONE_MINUS_SRC_ALPHA);

which is “normal alpha blending”. also make sure the shader writes an alpha of [icode]1.0[/icode].

still, the sphere should be occluded (camera below the plane) with depth testing and writing enabled O_o.

oh wait … maybe it’s the depth compare function. try [icode]GL11.glDepthFunc(GL11.GL_LEQUAL);[/icode]

Here’s my code so far.

Init:

GL11.glEnable(GL11.GL_DEPTH_TEST);
GL11.glDepthMask(true);
GL11.glDepthRange(0.0f, 1.0f);
GL11.glDisable(GL11.GL_CULL_FACE);
GL11.glDepthFunc(GL11.GL_LEQUAL);

Render:

GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
GL11.glEnable(GL11.GL_BLEND);
GL14.glBlendEquation(GL14.GL_FUNC_ADD);
GL11.glBlendFunc(GL11.GL_SRC_ALPHA,GL11.GL_ONE_MINUS_SRC_ALPHA);

ambient.use();

rootObject.render(ambient);

GL11.glBlendFunc(GL11.GL_ONE, GL11.GL_ONE);
GL11.glDepthMask(false);
GL11.glDepthFunc(GL11.GL_EQUAL);

for(int i = 0; i < lights.size(); i++){
	lights.get(i).use();
	rootObject.render(lights.get(i).getProgram());
}

GL11.glDepthFunc(GL11.GL_LESS);
GL11.glDepthMask(true);
GL11.glDisable(GL11.GL_DEPTH_TEST); // careful: this stays off for the next frame unless something re-enables it

remove lines 10 - 12 and 19 - 21 (the [icode]GL_ONE[/icode]/[icode]GL_ONE[/icode] blend switch and depth-state changes, and the three restore calls at the end) from the 2nd code block :slight_smile:

GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
GL11.glEnable(GL11.GL_BLEND);
GL14.glBlendEquation(GL14.GL_FUNC_ADD);
GL11.glBlendFunc(GL11.GL_SRC_ALPHA,GL11.GL_ONE_MINUS_SRC_ALPHA);

ambient.use();

rootObject.render(ambient);

for(int i = 0; i < lights.size(); i++){
   lights.get(i).use();
   rootObject.render(lights.get(i).getProgram());
}

… wtf!

My init code, if it helps:

GL11.glEnable(GL11.GL_DEPTH_TEST);
GL11.glDepthMask(true);
GL11.glDepthRange(0.0f, 1.0f);
GL11.glDisable(GL11.GL_CULL_FACE);
GL11.glDepthFunc(GL11.GL_LEQUAL);

looks fine to me… almost out of ideas. sorry :emo:

are you writing depth in the fragment shader?

Dunno…

Ambient shader:

#include "./asset/shader/global.fsh"

vec4 calcPixel()
{
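	// ambient term: just the diffuse texture sample
	// (note: the output alpha here comes straight from the texture)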
	return texture2D(u_material.map_Kd, v_texcoord);
}

Point Light Shader:

#include "./asset/shader/light.fsh"

uniform PointLight u_light;

uniform vec3 u_lPosition;
varying vec3 l_position;

vec4 calcLightModule(vec3 p_normal, vec3 p_position)
{
	vec3 l_direction = l_position - p_position;
	float dist = length(l_direction);
	if(dist > u_light.range) return vec4(0, 0, 0, 0);
	
	// standard constant/linear/quadratic attenuation; the +0.0001 guards against division by zero
	return lightModule(u_light.base, u_material, normalize(l_direction), p_normal, p_position)
	       / (u_light.atten.constant + u_light.atten.linear * dist
	          + u_light.atten.exponent * dist * dist + 0.0001);
}

The #include just reads the referenced file and splices it into the final source string before shader compilation.
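Something along these lines (a simplified sketch, not the exact loader code, and [icode]expandIncludes[/icode] is just an illustrative name):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

static String expandIncludes(String path) throws IOException {
	StringBuilder out = new StringBuilder();
	for(String line : Files.readAllLines(Paths.get(path))){
		if(line.trim().startsWith("#include")){
			// pull out the quoted path and splice that file's (expanded) source in
			String included = line.substring(line.indexOf('"') + 1, line.lastIndexOf('"'));
			out.append(expandIncludes(included));
		} else {
			out.append(line).append('\n');
		}
	}
	return out.toString();
}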

mhm … what i meant is the part which writes the fragment outputs, like [icode]out vec4 frag0;[/icode] or [icode]gl_FragColor[/icode] or [icode]gl_FragDepth[/icode].

I tried outputting just the depth, and it does this:

return vec4(gl_FragDepth, gl_FragDepth, gl_FragDepth, 1);

GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);

GL11.glEnable(GL11.GL_DEPTH_TEST);
GL11.glDepthRange(0.0f, 1.0f);
GL11.glDepthMask(true);
GL11.glDepthFunc(GL11.GL_LEQUAL);

ambient.use();

rootObject.render(ambient);

ah no no. best thing is not to touch gl_FragDepth at all.

it is an output variable, yes. but if you write to it you disable a couple of GL optimizations (early depth testing, for one): https://www.opengl.org/sdk/docs/man/html/gl_FragDepth.xhtml

and reading it before you’ve written it is undefined; if you just want to visualize the depth, read [icode]gl_FragCoord.z[/icode] instead.

it would explain the visual glitch. just a guess :slight_smile:

So what are you saying? What should I do next? I’m still very new to depth testing.

i suggest not writing to gl_FragDepth :slight_smile:

otherwise, your GL code looks fine to me. maybe somebody else can spot the source. :expressionless: