Limited Palettes

Hey all! Haven’t posted in a little bit.

Basically, I’ve made the switch to libgdx and I enjoy it quite a bit. I’ve also recently been reading up on pixel art theory and the like, and have grown fascinated with the idea of using limited color palettes. In particular, this resource made me realize what I’ve been looking for for quite some time:

http://www.pixeljoint.com/forum/forum_posts.asp?TID=12795

My question is—in addition to maintaining visual consistency throughout your piece and lending a (perhaps subjectively) nice aesthetic element to your pixel art—is there a performance gain to be had from using a limited color palette? The work above uses 16 colors, which is frankly quite amazing and works pretty well, but I’m thinking more along the lines of 256 colors, after doing some posterizing tests in GIMP on game art I’ve picked off Google. I’m still learning a lot about graphics, but I want to know whether a limited palette could lend a performance boost through libgdx, or at least OpenGL more specifically. I apologize for my naivete! Thank you in advance for your help and suggestions.

Colton

If you use a different colour format than RGBA8, then maybe it will take up less RAM. But otherwise I don’t think so.

I see what you mean. I know the Color class in libgdx can set its integer representations to 16, 24, and 32-bit, but I don’t see any 8-bit options. I’m just trying to think whether it would be worth setting something like that up by hand for any sort of gain. I can’t imagine the gain would be extreme, but it would be nice to be able to take advantage of a 256-color palette in that regard. :]

http://www.opengl.org/wiki/Image_Format

Just found this by using google. Maybe it will help.

Yeah, apart from the memory savings, you won’t gain much (if any) performance, as today’s cards and architectures are tuned for 32-bit colors.
And the memory savings will be negligible if you use pixel art. A 32x32 sprite costs nothing, even at 32-bit color depth:

32 pixels × 32 pixels × 1 byte = 1 KB for 8-bit (256 colors)
vs.
32 pixels × 32 pixels × 4 bytes = 4 KB for 32-bit (true color + alpha)

So I’d say you’ll always be safe with pixel art and OpenGL.
Correctly using sprite sheets will have a lot more impact.
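To put numbers on it, here’s a quick throwaway sketch in Java (the sprite sizes are just examples):

```java
// Rough texture-memory cost of an uncompressed sprite at different color depths.
public class SpriteMemory {
    // bytesPerPixel: 1 for 8-bit indexed, 4 for 32-bit RGBA
    static int textureBytes(int width, int height, int bytesPerPixel) {
        return width * height * bytesPerPixel;
    }

    public static void main(String[] args) {
        System.out.println("32x32 @ 8-bit:  " + textureBytes(32, 32, 1) + " bytes"); // 1024 (1 KB)
        System.out.println("32x32 @ 32-bit: " + textureBytes(32, 32, 4) + " bytes"); // 4096 (4 KB)
    }
}
```

Even a few hundred sprites at 4 KB each is only about a megabyte, which is why the savings don’t matter for pixel art.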

Appreciate it. That didn’t really have what I was looking for, though. I was thinking more along the lines of indexed color, which I read up on in the Wikipedia article and learned a bit more about. I may do some research into it with OpenGL, but arnaud’s information suggests it may not be worth the time.

I see, I see. At least I know where to focus my efforts! ;] Thanks!

Indexed colours only reduce file size. Once the file is loaded, every single pixel will have a 32-bit int for its colour, unless you use a different format. Unfortunately that also decreases your available choices before you even make a palette.

But yes, as said, the difference is not worth it.

I was under the impression that was not the case, otherwise it would just make sense to use 32-bit color to begin with. My understanding is the pixels are rendered based on offsets rather than storing their own color data. Here’s an excerpt from the Wikipedia article:

“Indexed color saves a lot of memory, storage space, and transmission time: using truecolor, each pixel needs 24 bits, or 3 bytes. A typical 640×480 VGA resolution truecolor uncompressed image needs 640×480×3 = 921,600 bytes (900 KiB). Limiting the image colors to 256, every pixel needs only 8 bits, or 1 byte each, so the example image now needs only 640×480×1 = 307,200 bytes (300 KiB), plus 256×3 = 768 additional bytes to store the palette map in itself (assuming RGB), approx. one third of the original size. Smaller palettes (4-bit 16 colors, 2-bit 4 colors) can pack the pixels even more (to 1/6 or 1/12), obviously at cost of color accuracy.”

Article for reference: http://en.wikipedia.org/wiki/Indexed_color

Unless I’m misunderstanding what it says?
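For what it’s worth, the arithmetic from that excerpt checks out when I redo it myself (just a throwaway sketch):

```java
// Re-doing the Wikipedia example: 640x480 truecolor vs. 256-color indexed.
public class IndexedColorMath {
    public static void main(String[] args) {
        int trueColor = 640 * 480 * 3;           // 3 bytes per pixel, no palette
        int indexed   = 640 * 480 * 1 + 256 * 3; // 1 byte per pixel + 768-byte RGB palette
        System.out.println(trueColor); // 921600 bytes (900 KiB)
        System.out.println(indexed);   // 307968 bytes (~300 KiB)
    }
}
```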

HeroesGraveDev is correct, he said “unless you use a different format”.
You can tell OpenGL what format an image should be stored in in graphics memory when you load it, with glTexImage2D for example (google it and you’ll find all the color formats listed).

The OpenGL fixed pipeline does have support for palettes and color indices (I’ve never used it though, and I don’t know how to do it with shaders either), but it’s rarely used, because nowadays it’s not worth the hassle. I doubt libgdx uses it; it probably just tells OpenGL to store all pixels in graphics memory as 16, 24, or 32-bit, without using color indices, to keep things simple. You’d have to code directly against OpenGL to use its palette capabilities.

Just work in 32bit color depth, that way:

  • you have all the colors you want
  • you lose no performance, and almost nothing in terms of memory
  • you have alpha
  • you don’t have to worry about anything, it’ll work ™

Limit your palette on the artistic side, when you create your images/sprites, NOT on the technical side, which just brings more troubles than benefits.

I see. That makes sense, especially considering that I would like to be able to incorporate colored box2dlights, shaders, post-processing, and whatnot, but it was nice to at least think about it and figure out why it wouldn’t work. Thanks!

It might be interesting to view the Another World 15th anniversary edition Making of video (which is French, but with subtitles), which talks about the subject of using a limited color palette (16 colors) to still create something that breathes atmosphere. That game in particular was a huge success in that department.

part 1: http://www.youtube.com/watch?v=hWBV08FTXFw
part 2: http://www.youtube.com/watch?v=PPDN-tE7vQ0

Awesome, that sounds very interesting! I’ll give it a watch. :] Thanks!

Don’t waste your time storing images in a different format than 32-bit RGBA. It just makes things more complicated. In most games, the performance improvement comes from using modern OpenGL correctly (no glBegin, glMatrixMode, etc.). There was something called color index mode, but it is deprecated; if you want it back, you need to implement the color lookup yourself in the fragment shader. Texture compression is something you could use instead.
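For anyone curious what that fragment-shader lookup amounts to, here’s a CPU-side sketch of the same idea in plain Java (a real shader version would sample a 256×1 palette texture per fragment instead; the class and method names here are made up for illustration):

```java
// CPU-side sketch of a palette lookup: each pixel stores a 1-byte index
// into a 256-entry RGBA palette. A fragment shader would do the same
// thing by sampling an index texture, then a 256x1 palette texture.
public class PaletteLookup {
    static int[] expand(byte[] indices, int[] paletteRgba) {
        int[] out = new int[indices.length];
        for (int i = 0; i < indices.length; i++) {
            out[i] = paletteRgba[indices[i] & 0xFF]; // unsigned byte -> palette entry
        }
        return out;
    }

    public static void main(String[] args) {
        int[] palette = new int[256];
        palette[0] = 0xFF000000; // opaque black
        palette[1] = 0xFFFF0000; // opaque red
        byte[] image = {0, 1, 1, 0}; // a tiny 1-byte-per-pixel "texture"
        int[] rgba = expand(image, palette);
        System.out.println(Integer.toHexString(rgba[1])); // ffff0000
    }
}
```

Note that the expanded output is full 32-bit color again, which is exactly why indexed color only helps file size, not what ends up in memory.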