How to deal with max. number of loaded textures

hey,

i just realized that there seems to be an upper limit on the number of textures that can be loaded, and that limit is different on different graphics cards.
it ranges from 4 to 36 on the computers where i tested my game. since i never really had the time (or the brains) to dig deeper into lwjgl/opengl, it's very likely that i'm not doing things the right way. so i wonder if you opengl gurus can give me a hint on that. do i really have to live with that limitation, or is it likely that there's some bug in my code? if this is a hard limit, i wonder how others can create such amazing games with tons of images on screen at the same time…

here’s my exception that i get when i try to load too many textures:

org.lwjgl.opengl.OpenGLException: Invalid value (1281)
	at org.lwjgl.opengl.Util.checkGLError(Util.java:56)
	at org.lwjgl.opengl.Display.swapBuffers(Display.java:555)
	at org.lwjgl.opengl.Display.update(Display.java:571)
...

it disappears when i reduce the number.

i will post my code later (if needed) but first, i'd like to hear your opinions.
thanks a lot for your help!

You can have as many textures as the card has memory for. The limit you have run into is the number of texturing units on the card, not the number of textures the card can handle. This means that you can use 4-32 textures to color one object, but you can use different textures to color different objects. The solution is to bind different textures to the texturing units during rendering as you switch to render different objects. This does not mean loading them from disk again; the textures stay resident in graphics card memory, they are just not used for the current draw call.
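A minimal sketch of that per-object switching (class and method names here are hypothetical, not from the OP's code). In real LWJGL code the bind would be `GL11.glBindTexture(GL11.GL_TEXTURE_2D, id)`; tracking the currently bound id also lets you skip redundant state changes:

```java
// Sketch: each object stores its own texture id, and you bind that id
// right before drawing the object. The texture stays in video memory
// the whole time; only the "currently bound" slot changes.
public class TextureBinder {
    private int boundId = 0;   // 0 = nothing bound yet
    private int bindCalls = 0; // counts actual (simulated) GL binds

    // Real LWJGL equivalent: GL11.glBindTexture(GL11.GL_TEXTURE_2D, id);
    public void bind(int id) {
        if (id == boundId) return; // already bound, skip the state change
        boundId = id;
        bindCalls++;
    }

    public int getBoundId()   { return boundId; }
    public int getBindCalls() { return bindCalls; }
}
```

Calling `bind` with the same id twice in a row performs only one real bind, which is why sorting your draw calls by texture is a common optimization.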

If you are just starting with OpenGL and LWJGL, I recommend using jMonkeyEngine; this issue (and a lot more) is already handled by the engine for you.

Your first step would be finding out what gl call is causing problems. Find a copy of the Red Book and read the section about error reporting and glGetError. Then scatter calls to Util.checkGLError around your drawing code to figure out what call is failing. Then you’ll be able to look up the function in the docs and it’ll tell you what circumstances it can trigger “invalid value”.
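When you do get a code back from glGetError (or from an OpenGLException like the one above), a small lookup helper makes it readable. The mapping below uses the standard OpenGL error constants; note that 1281 is exactly GL_INVALID_VALUE, the code in the OP's stack trace:

```java
// Maps the numeric codes returned by glGetError() (and shown in
// LWJGL's OpenGLException messages) to their symbolic names.
public class GLErrorNames {
    public static String name(int code) {
        switch (code) {
            case 0x0500: return "GL_INVALID_ENUM";      // 1280
            case 0x0501: return "GL_INVALID_VALUE";     // 1281
            case 0x0502: return "GL_INVALID_OPERATION"; // 1282
            case 0x0503: return "GL_STACK_OVERFLOW";    // 1283
            case 0x0504: return "GL_STACK_UNDERFLOW";   // 1284
            case 0x0505: return "GL_OUT_OF_MEMORY";     // 1285
            default:     return "UNKNOWN (" + code + ")";
        }
    }
}
```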

At a guess you're passing an invalid value to a gl function, but without seeing the code and knowing which call is failing, that's about as much as we can tell you.

@OrangyTang: when i reduce the number of images/textures which i load, i don’t get the error. i don’t think it’s a wrong value which is passed in. additionally i have no problems on my pc with a gamer graphics card. the error only appears on my laptop with weak graphics.
but thanks for the hint with glGetError.

@VeaR: i'm not sure if i get you right. are you saying i can use a higher number of textures in total, but i would still not be able to display more than 4-32 images at the same time (in one game scene)?

@OrangyTang: i have to apologize. you were right. i had a texture which had the dimensions 15x16 pixels. it’s strange that this doesn’t pose a problem on my pc. thanks again for the hint with glGetError()!!

You can have an unlimited number of textures visible at the same time, but you cannot have an unlimited number of textures active (bound to texturing units) at the same time.

phew, that’s good to know.
thanks for the info!

Textures usually have to have dimensions that are powers of two (e.g. 16x64, 128x256). Better cards support extensions that remove this restriction. Either stick to power-of-two textures, or check for GL_ARB_texture_non_power_of_two and handle it gracefully if it's not present.
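Two small utilities make the safe route easy: validate each dimension before uploading, and round odd sizes (like the 15 in the OP's 15x16 texture) up to the next power of two so the image can be padded. This is a plain-Java sketch; the method names are just illustrative:

```java
// Helpers for the power-of-two texture restriction.
public class Pot {
    // True iff n is a power of two: exactly one bit is set.
    public static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    // Smallest power of two >= n, useful for padding image dimensions.
    public static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }
}
```

With these you would pad a 15x16 image into a 16x16 texture (and render only the 15x16 sub-region via texture coordinates), which works on cards without the extension.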