Strange rendering fail

Hi guys!

Currently I’m setting up the basic level rendering, but a strange error occurred: the bottom line of the tile image shows at the top of the tile (source image size: 64x64).
So the source y-coords seem to be (I think): 64, 1, 2, 3, …
I also searched on Google, but the solution I found doesn’t help, because my images ARE power-of-two sized.

Does someone know a solution?

Here’s an image of the problem:
http://imageshack.us/photo/my-images/854/renderingfail.png/

Does it do that for every instance of one tile, or for every tile?

It happens every time a tile is drawn.
I have an array of integers which holds indices into an array of images.
e.g.:
int index = tilemap[xCoord][yCoord];
images[index].draw();
I also tried that code with Slick2D; same error.

it could possibly be a problem with the texture loader.
what is your draw() code?

The texture loader should not be the problem, because the error also occurred when I used Slick2D.
But thanks, deepthought, for the hint. Well, actually it was the draw method:
I was drawing the image with floating-point coordinate values, but it seems there is no problem if I use integer values.
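For reference, the workaround amounts to snapping positions to whole pixels before the draw call. A minimal sketch (the `snap` helper and its name are my illustration, not code from the thread):

```java
public class PixelSnap {
    // Hypothetical sketch of the fix: round floating-point world positions
    // to whole pixels before drawing, so the quad lands exactly on pixel
    // boundaries instead of sampling between texel rows.
    public static int snap(float coord) {
        return Math.round(coord);
    }

    public static void main(String[] args) {
        System.out.println(snap(3.7f));  // prints 4
        System.out.println(snap(12.2f)); // prints 12
    }
}
```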

Think I can close the topic.

Thank you for the answers! :slight_smile:

Hmm, strange that it would do that.

The problem seemed to be solved, but it isn’t. >:(
When I scale the tiles with glScalef(…) it does not render correctly.
The background image is scaled over the whole window, and the lowest row of pixels appears at the top again.
Is it a problem to scale images?
Because I get the same problem with Slick2D.

Hope someone can help.

So…

Here’s the TextureLoader:

public class TextureLoader {
    /* singleton */
    private static TextureLoader instance;
    /* Contains the path of the Texture and the Texture itself */
    private HashMap<String, Texture> textureMap;
    /* the color models for alpha and non-alpha images */
    private ColorModel glAlphaColorModel;
    private ColorModel glColorModel;

    // initialize fields
    private TextureLoader() {
        textureMap = new HashMap<String, Texture>();
        glAlphaColorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB),
                new int[]{8, 8, 8, 8},
                true,
                false,
                ComponentColorModel.TRANSLUCENT,
                DataBuffer.TYPE_BYTE);

        glColorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB),
                new int[]{8, 8, 8, 8},
                false,
                false,
                ComponentColorModel.OPAQUE,
                DataBuffer.TYPE_BYTE);
    }
    
    // return the singleton
    public static synchronized TextureLoader getInstance() {
        if (instance == null) {
            instance = new TextureLoader();
        }
        return instance;
    }

    public Texture load(String path) throws IOException {
        Texture texture = textureMap.get(path);

        if (texture == null) {
            texture = getTexture(path,
                    GL11.GL_TEXTURE_2D,
                    GL11.GL_RGBA,
                    GL11.GL_LINEAR,
                    GL11.GL_LINEAR);

            textureMap.put(path, texture);
        }

        return texture;
    }

    private Texture getTexture(String path, int target, int dstPixelFormat, int minFilter, int magFilter) throws IOException {
        int srcPixelFormat;
        int textureID = createTextureID();

        Texture texture = new Texture(target, textureID);

        GL11.glBindTexture(target, textureID);

        BufferedImage image = ImageLoader.load(path);
        texture.setWidth(image.getWidth());
        texture.setHeight(image.getHeight());

        boolean hasAlpha = image.getColorModel().hasAlpha();
        if (hasAlpha) {
            srcPixelFormat = GL11.GL_RGBA;
        } else {
            srcPixelFormat = GL11.GL_RGB;
        }

        ByteBuffer buffer = convertImageData(image, texture, hasAlpha);

        if (target == GL11.GL_TEXTURE_2D) {
            GL11.glTexParameteri(target, GL11.GL_TEXTURE_MIN_FILTER, minFilter);
            GL11.glTexParameteri(target, GL11.GL_TEXTURE_MAG_FILTER, magFilter);
        }

        GL11.glTexImage2D(target,
                0,
                dstPixelFormat,
                Number.getPowerOfTwo(2, image.getWidth()),
                Number.getPowerOfTwo(2, image.getHeight()),
                0,
                srcPixelFormat,
                GL11.GL_UNSIGNED_BYTE,
                buffer);

        return texture;

    }

    private int createTextureID() {
        return GL11.glGenTextures();
    }

    private ByteBuffer convertImageData(BufferedImage image, Texture texture, boolean hasAlpha) {
        ByteBuffer buffer;

        WritableRaster raster;
        BufferedImage textureImage;

        int textureWidth = Number.getPowerOfTwo(2, image.getWidth());
        int textureHeight = Number.getPowerOfTwo(2, image.getHeight());

        texture.setTextureWidth(textureWidth);
        texture.setTextureHeight(textureHeight);

        int bands = 3;
        ColorModel textureColorModel = glColorModel;

        if (hasAlpha) {
            bands = 4;
            textureColorModel = glAlphaColorModel;
        }

        raster = Raster.createInterleavedRaster(DataBuffer.TYPE_BYTE,
                textureWidth,
                textureHeight,
                bands,
                null);
        textureImage = new BufferedImage(textureColorModel, raster, false, new Hashtable());

        Graphics g = textureImage.createGraphics();
        g.setColor(new Color(0, 0, 0, 0));
        g.fillRect(0, 0, textureImage.getWidth(), textureImage.getHeight());
        g.drawImage(image, 0, 0, textureImage.getWidth(), textureImage.getHeight(), null);

        byte[] data = ((DataBufferByte) textureImage.getRaster().getDataBuffer()).getData();

        int dataLength = data.length;

        buffer = ByteBuffer.allocateDirect(dataLength);
        buffer.order(ByteOrder.nativeOrder());
        buffer.put(data, 0, dataLength);
        buffer.flip();


        return buffer;
    }
}
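The `Number.getPowerOfTwo` helper used above isn’t shown in the thread; judging by the call sites, I’m assuming it rounds a dimension up to the nearest power of two. A hypothetical reconstruction (not the poster’s actual code):

```java
public final class Number {
    // Assumed behaviour: return the smallest power of `base` that is >= value.
    // In the thread it is only ever called with base = 2, so getPowerOfTwo(2, 40)
    // would yield 64, and getPowerOfTwo(2, 64) stays 64.
    public static int getPowerOfTwo(int base, int value) {
        int result = base;
        while (result < value) {
            result *= base;
        }
        return result;
    }
}
```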

That’s how I draw the texture.


GL11.glEnable(GL11.GL_TEXTURE_2D);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture.getTextureID());

GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY);

if (rotation != 0) {
    GL11.glPushMatrix();
    GL11.glTranslatef(centerX, centerY, 0);
    GL11.glRotatef(rotation, 0, 0, 1);
    GL11.glTranslatef(-centerX, -centerY, 0);
}

GL11.glVertexPointer(2, 0, vertexBuffer);
GL11.glTexCoordPointer(2, 0, textureBuffer);

GL11.glDrawArrays(GL11.GL_QUADS, 0, 4);

if (rotation != 0) {
    GL11.glPopMatrix();
}

GL11.glDisableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glDisable(GL11.GL_TEXTURE_2D);

I want to solve the problem as soon as possible, because it’s annoying to test with.
I’ve really no idea what the problem is or how to solve it.

An image was posted earlier; I hope this is enough illustrative material.

I didn’t see anything that I believe could cause your error, although you haven’t in fact posted how you generate texture coordinates. I do have a couple of notes, however. First, I believe there is a bug in your initialization of glColorModel:


glColorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB),
        new int[]{8, 8, 8, 0}, // Last number is a zero; your code shows an 8.
        false,
        false,
        ComponentColorModel.OPAQUE,
        DataBuffer.TYPE_BYTE);

Second, you use a synchronized method with an if statement to get the instance. Why not use a static initializer?


static {
    instance = new TextureLoader();
}

Though it would probably be better to make the whole class static. Is anything stopping you? I know it seems a bit odd in Java, but I honestly have whole packages which are almost entirely static for this kind of thing.
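Another lazy-but-thread-safe option, for comparison, is the initialization-on-demand holder idiom; the JVM guarantees the nested class is initialized exactly once, on first access, so the accessor needs no synchronization. A sketch with a stand-in class name:

```java
public class LoaderSketch {
    // Stand-in for TextureLoader; the idiom is what matters here.
    private LoaderSketch() { }

    // The JVM initializes Holder lazily and thread-safely the first time
    // getInstance() touches it, so no `synchronized` keyword is required.
    private static class Holder {
        static final LoaderSketch INSTANCE = new LoaderSketch();
    }

    public static LoaderSketch getInstance() {
        return Holder.INSTANCE;
    }
}
```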

first of all, thank you!
I changed it ;D

so, here’s how I generate the coords:
An IntBuffer for the vertices:
An IntBuffer for the vertices:

IntBuffer vertexBuffer = BufferUtils.createIntBuffer(8);

initializing:

vertexBuffer.put(new int[] {0, 0,
                            texture.getWidth(), 0,
                            texture.getWidth(), texture.getHeight(),
                            0, texture.getHeight()});
vertexBuffer.flip();

texture is my texture (who knew that), which holds the width and height (64x64 in my case; when rendered it is about 40x40).

and the texture buffer:

FloatBuffer textureBuffer = BufferUtils.createFloatBuffer(8);

initializing:

textureBuffer.put(new float[] {0, 0,
                               1, 0,
                               1, 1,
                               0, 1});
textureBuffer.flip();

maybe it might help:

Image tile = availableTiles[tileLayer1[i][j]-1];
tile.draw((int)(i*tileSize + offsetX), (int)(j*tileSize + offsetY));

That’s how a tile is drawn.
Image is a class which holds the texture + buffers and draws the image.

So how are you making it 40x40 when the texture coordinates are clearly 0 to 1?

as far as I can see, that should be:


new float[] {
    0, 0,
    40/64, 0,
    40/64, 40/64,
    0, 40/64
}

Note: 40/64 = 0 since it’s an integer division. 40f/64 gives you a float division which is what you want in this case.
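The difference is easy to check in isolation:

```java
public class DivisionDemo {
    public static void main(String[] args) {
        int truncated = 40 / 64;  // both operands int: integer division, truncates to 0
        float exact = 40f / 64;   // one float operand promotes the division: 0.625f
        System.out.println(truncated); // prints 0
        System.out.println(exact);     // prints 0.625
    }
}
```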

Thank you, you’re quite right. Will I ever remember to include that? Doubtful.

Thank you for the answer!

40f/64 will just cut the image.
But I tried it: the image was cut, and the problem was there again (wtf).

I have a 64x64 image that’s scaled down to 40x40.

I don’t understand how you are “scaling it down.” The sizes you put into glTexImage2D have to be 64 and 64, and the texture coords you’re using are 1 and 1, so are you using a texture matrix, or what? None of the code you have posted could scale a 64x64 image to a 40x40 image.

Well…
I have two ways to initialize an image.

  1. way: use the texture’s default width and height.
  2. way: give two additional parameters (width and height),
    so instead of adding texture.getWidth() to the vertex buffer I add that width.

so I initialize an image like: new Image(“path”, sizex, sizey)

that’s how an image is “scaled”.

The size of the square you’re drawing has nothing to do with the area of the texture used. The texture is stretched or shrunk to fit the vertex coordinates. It is the texture coordinates which specify what area of the texture to draw.

I want to draw 100% of my texture (the whole texture).
The only thing is that the texture has a destination size of 40x40 (but 100% of the texture’s area in the destination quad, just scaled).
Do the texture coords also have to be scaled down?
as said before:

[quote]The texture is stretched or shrunk to fit the vertex coordinates.
[/quote]
I’m not as advanced as I want to be in OpenGL.
But I’m working on it. ;D

No, the texture is size 64x64:

The next power of two up from 40 is 64. So the size of the texture is 64x64, but you only want the 40x40 square in the top-left-hand corner. (It’s the top since OpenGL stores textures from the top downward.) Hence your texture coordinates should be:


{
    0, 1 - (40f/64),
    1, 1 - (40f/64),
    1, 1,
    0, 1
}

I realize now that I got this wrong before; sorry. I always make sure the textures I load are POT anyway, so it’s not something I’m used to.
Your vertex coords can be whatever you want; they have no effect on what area of the texture is drawn, only on where it is drawn on the screen.
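That arithmetic can be wrapped in a small helper. This is my own generalization of the coordinates above (class and method names are mine, and I mirror the v calculation in u as well, which the posted coordinates do not do):

```java
public class SubTexCoords {
    // For a w x h image padded into a potW x potH power-of-two texture,
    // return the normalized texture coordinates {u0, v0, u1, v1} of the
    // used region, with v anchored at v = 1 as in the answer above.
    public static float[] region(int w, int h, int potW, int potH) {
        float u1 = (float) w / potW;      // right edge of the used columns
        float v0 = 1f - (float) h / potH; // far edge, measured back from v = 1
        return new float[] { 0f, v0, u1, 1f };
    }
}
```

For the 40x40-in-64x64 case this gives u from 0 to 0.625 and v from 0.375 to 1.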

You’re my hero!
Thank you very much! :slight_smile: