GL_TEXTURE_2D Unexpected Invalid Enum

I am having sort of a weird problem.

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, bb);

This call fails with an invalid-enum error, even though I've verified the same code works elsewhere. I'm using OpenGL 2.1.

[quote]OpenGL ran into an error | 1280 | Invalid enum
at main.Main.checkOpenGLError(Main.java:299)
at render.texture.data.TextureLoader.loadTexture(TextureLoader.java:51)
at scene.SceneLoader.loadScene(SceneLoader.java:92)
at main.Main.main(Main.java:154)
[/quote]
I have no idea what is going on. I've tried all the different formats, but this silly old clunker won't accept any of them. The error comes back from the driver; I'm checking it with GL11.glGetError().
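One thing that can help here: glGetError() is sticky, so the error you read may have been raised by an earlier call, not by glTexImage2D itself. Drain the error queue right before the suspect call, then check again right after. The helper below is a made-up name (`GlErrors.glErrorName`), not part of LWJGL, but the numeric codes are fixed by the OpenGL spec (GL_INVALID_ENUM = 0x0500 = 1280, etc.), so it makes logged errors readable:

```java
public class GlErrors {
    // Hypothetical helper: translate a raw glGetError() code into a readable name.
    // The numeric values below are defined by the OpenGL specification.
    public static String glErrorName(int code) {
        switch (code) {
            case 0x0500: return "GL_INVALID_ENUM";      // 1280
            case 0x0501: return "GL_INVALID_VALUE";     // 1281
            case 0x0502: return "GL_INVALID_OPERATION"; // 1282
            case 0x0503: return "GL_STACK_OVERFLOW";    // 1283
            case 0x0504: return "GL_STACK_UNDERFLOW";   // 1284
            case 0x0505: return "GL_OUT_OF_MEMORY";     // 1285
            default:     return "UNKNOWN (0x" + Integer.toHexString(code) + ")";
        }
    }
}
```

Usage sketch (inside a live GL context): loop `while (GL11.glGetError() != GL11.GL_NO_ERROR) {}` before the upload to clear stale errors, make the glTexImage2D call, then log `glErrorName(GL11.glGetError())` to confirm the failing call is really this one.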

I'm not using anything from the fixed-function pipeline, either.

[quote]GL_INVALID_ENUM is generated if target is not GL_TEXTURE_2D,
GL_PROXY_TEXTURE_2D,
GL_PROXY_TEXTURE_CUBE_MAP,
GL_TEXTURE_CUBE_MAP_POSITIVE_X,
GL_TEXTURE_CUBE_MAP_NEGATIVE_X,
GL_TEXTURE_CUBE_MAP_POSITIVE_Y,
GL_TEXTURE_CUBE_MAP_NEGATIVE_Y,
GL_TEXTURE_CUBE_MAP_POSITIVE_Z, or
GL_TEXTURE_CUBE_MAP_NEGATIVE_Z.

GL_INVALID_ENUM is generated if target is one of the six cube map 2D image targets and the width and height parameters are not equal.

GL_INVALID_ENUM is generated if type is not a type constant.

GL_INVALID_ENUM is generated if type is GL_BITMAP and format is not GL_COLOR_INDEX.

[/quote]
Any idea?

Thanks

Try using different formats. Replace both GL_RGBA arguments (the internalFormat and the format) with GL_RGB and see if it works.

You kind of ghosted me. Nonetheless, thanks for the reply, ShadedVertex.
I've tried everything BUT a different texture. I didn't realize the error came from a texture of size (2295^2)*4 bytes (2295×2295 pixels at 4 bytes per pixel).

I just came back to update my findings: evidently the texture is too large, and that is what generates the error. I tried 1024, then 2048, and it maxed out at 2051, which is a pretty strange number. Nonetheless, it works now.
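For what it's worth, the robust way to find that limit is to query it rather than probe it: GL11.glGetInteger(GL11.GL_MAX_TEXTURE_SIZE) reports the driver's maximum after context creation (per the spec an oversize texture should raise GL_INVALID_VALUE rather than GL_INVALID_ENUM, but drivers are not always faithful about which error they report). Below is a minimal sketch of a pre-upload check; `TextureLimits` and `fits` are made-up names, and the power-of-two check is there because some GL 2.x drivers only handle non-power-of-two textures poorly even though NPOT support is nominally core since GL 2.0:

```java
public class TextureLimits {
    // Hypothetical helper: check whether a width/height pair is safe to upload,
    // given the driver-reported limit (query it once with
    // GL11.glGetInteger(GL11.GL_MAX_TEXTURE_SIZE) after creating the context).
    public static boolean fits(int width, int height, int maxTextureSize) {
        return width > 0 && height > 0
            && width <= maxTextureSize && height <= maxTextureSize;
    }

    // True if n is a positive power of two (1024, 2048, ...); safest choice
    // of texture dimension on older GL 2.x hardware.
    public static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }
}
```

A loader could call `fits(width, height, max)` before glTexImage2D and downscale (or fail with a clear message) instead of letting the driver report a cryptic enum error.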