Converting RGB images to a master palette

Hi, so I have a master palette of 256 colors. I want to convert a loaded RGB image to use this palette: essentially, snapping each color in the RGB image to the closest color it can find in the palette. I’ve tried for a few hours now to brute-force it, but it takes too long and produces vastly incorrect results. Does anyone have a solution?

did you try hashing yet? i mean something like [icode]int idx = int(luminance(color.rgb) * 255)[/icode] - for a gray-scale ramp.

anyway, i guess after brute force some sort of hashing would do the trick.

There are different solutions depending on if the palette is fixed or flexible.

You can do pretty well by taking a color distance metric and selecting the nearest palette color. Keep track
of the RGB error and add it into the next pixel, so the average of the selected pixels stays near the original RGB colors.
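A rough Java sketch of that error-propagation idea (the palette layout and all names here are mine, just for illustration):

```java
// Quantize one scanline to a fixed palette, carrying the quantization
// error of each pixel into the next one (1-D error diffusion).
// 'palette' holds packed 0xRRGGBB colors; the result is palette indices.
class ErrorDiffusion {
	static int clamp(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

	static int findNearest(int[] palette, int r, int g, int b) {
		int best = 0;
		long bestDist = Long.MAX_VALUE;
		for (int i = 0; i < palette.length; i++) {
			long dr = r - ((palette[i] >> 16) & 0xFF);
			long dg = g - ((palette[i] >> 8) & 0xFF);
			long db = b - (palette[i] & 0xFF);
			long d = dr * dr + dg * dg + db * db;
			if (d < bestDist) { bestDist = d; best = i; }
		}
		return best;
	}

	static int[] quantizeRow(int[] row, int[] palette) {
		int[] out = new int[row.length];
		int er = 0, eg = 0, eb = 0; // error carried from the previous pixel
		for (int x = 0; x < row.length; x++) {
			int r = clamp(((row[x] >> 16) & 0xFF) + er);
			int g = clamp(((row[x] >> 8) & 0xFF) + eg);
			int b = clamp((row[x] & 0xFF) + eb);
			int idx = findNearest(palette, r, g, b);
			out[x] = idx;
			er = r - ((palette[idx] >> 16) & 0xFF);
			eg = g - ((palette[idx] >> 8) & 0xFF);
			eb = b - (palette[idx] & 0xFF);
		}
		return out;
	}
}
```

With only black and white in the palette, a mid-grey row comes out as alternating indices, which is exactly the averaging effect described above.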

If you can choose your palette, then histogram your colors, walk the histogram along a Peano curve to group
histogram colors into palette colors, and make each palette color the centroid of its corresponding histogram
box. This can produce magically good pictures even with very few colors.
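The Peano-curve walk itself is fairly involved, but the histogram-plus-centroid part can be sketched much more simply: bin the image into coarse RGB boxes, keep the most populated boxes, and emit each box’s centroid as a palette entry. All names are invented, and this is a crude stand-in for the curve-based grouping, not an implementation of it:

```java
import java.util.Arrays;

// Build a palette by histogramming into 16x16x16 RGB boxes and taking
// the centroid of the most populated boxes.
class HistogramPalette {
	static int[] buildPalette(int[] pixels, int maxColors) {
		long[] count = new long[4096];
		long[] sumR = new long[4096], sumG = new long[4096], sumB = new long[4096];
		for (int p : pixels) {
			int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
			int box = ((r >> 4) << 8) | ((g >> 4) << 4) | (b >> 4); // 4 bits per channel
			count[box]++; sumR[box] += r; sumG[box] += g; sumB[box] += b;
		}
		Integer[] boxes = new Integer[4096];
		for (int i = 0; i < 4096; i++) boxes[i] = i;
		Arrays.sort(boxes, (a, b) -> Long.compare(count[b], count[a])); // most populated first
		int n = 0;
		while (n < maxColors && count[boxes[n]] > 0) n++;
		int[] palette = new int[n];
		for (int i = 0; i < n; i++) {
			long c = count[boxes[i]];
			palette[i] = ((int) (sumR[boxes[i]] / c) << 16)
			           | ((int) (sumG[boxes[i]] / c) << 8)
			           |  (int) (sumB[boxes[i]] / c);
		}
		return palette;
	}
}
```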

Of course, doing all this efficiently is difficult to do yourself, and I have no idea what commercial color reduction
programs will work well for you. Photoshop is not bad.

I guess a reverse lookup table would kinda work. If you have 256 × 256 × 256 different colors possible, you can simply create a table for all those combinations (16MB at one byte per entry). I guess doing some kind of approximation to reduce the size of the table would be a good idea though. xd
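A middle ground is to reserve the full 2^24 index space but fill entries lazily, so only colors that actually occur pay the brute-force cost. A sketch (names invented for the example; shorts are used so -1 can mark an empty slot, which doubles the 16MB to 32MB):

```java
import java.util.Arrays;

// Reverse lookup table over all 2^24 colors, filled on first use.
class PaletteLookup {
	private final int[] palette;
	private final short[] cache = new short[1 << 24]; // -1 means "not computed yet"

	PaletteLookup(int[] palette) {
		this.palette = palette;
		Arrays.fill(cache, (short) -1);
	}

	int index(int rgb) {
		rgb &= 0xFFFFFF;
		short cached = cache[rgb];
		if (cached >= 0) return cached;
		int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
		int best = 0;
		long bestDist = Long.MAX_VALUE;
		for (int i = 0; i < palette.length; i++) {
			long dr = r - ((palette[i] >> 16) & 0xFF);
			long dg = g - ((palette[i] >> 8) & 0xFF);
			long db = b - (palette[i] & 0xFF);
			long d = dr * dr + dg * dg + db * db;
			if (d < bestDist) { bestDist = d; best = i; }
		}
		cache[rgb] = (short) best;
		return best;
	}
}
```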

^ Yes, that would also reduce lookup misses a lot, so the colour distance has to be calculated far fewer times. And if the image is getting reduced to 256 colours anyway, it sounds like it could stand some simple posterizing first, like truncating R, G and B each to the nearest multiple of 8 or 16.
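A sketch of that posterize-first idea: with 5 bits per channel the table shrinks to 32768 entries, cheap enough to fill eagerly up front (names invented for the example):

```java
// Posterize to 5 bits per channel, so the whole reverse lookup table
// fits in 32768 bytes and can be precomputed in the constructor.
class CoarseLookup {
	private final byte[] table = new byte[1 << 15];
	private final int[] palette;

	CoarseLookup(int[] palette) {
		this.palette = palette;
		for (int key = 0; key < (1 << 15); key++) {
			int r = ((key >> 10) & 31) << 3; // expand 5 bits back to 8
			int g = ((key >> 5) & 31) << 3;
			int b = (key & 31) << 3;
			table[key] = (byte) nearest(r, g, b);
		}
	}

	int index(int rgb) {
		int key = (((rgb >> 19) & 31) << 10) | (((rgb >> 11) & 31) << 5) | ((rgb >> 3) & 31);
		return table[key] & 0xFF;
	}

	private int nearest(int r, int g, int b) {
		int best = 0;
		long bestDist = Long.MAX_VALUE;
		for (int i = 0; i < palette.length; i++) {
			long dr = r - ((palette[i] >> 16) & 0xFF);
			long dg = g - ((palette[i] >> 8) & 0xFF);
			long db = b - (palette[i] & 0xFF);
			long d = dr * dr + dg * dg + db * db;
			if (d < bestDist) { bestDist = d; best = i; }
		}
		return best;
	}
}
```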

The OP said that brute forcing wasn’t working and was producing incorrect results. I was just curious what the brute forcing consisted of? Also, how accurate do the results have to be?

Thanks everyone :slight_smile: richierich, the brute-force version was where I converted all the RGB colors from the image to LAB, since I read that that color space makes distance calculations simple and correct.
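For reference, a typical RGB-to-LAB conversion (sRGB primaries, D65 white point, the standard formulas) looks roughly like this; a generic sketch, not necessarily the exact code that was used:

```java
// sRGB (0..1 per channel) -> CIELAB {L, a, b}, L in 0..100.
// Squared differences of L, a, b then give a perceptually more
// uniform distance than raw RGB.
class LabConvert {
	private static float f(float t) {
		return t > 0.008856f ? (float) Math.cbrt(t) : 7.787f * t + 16f / 116f;
	}

	private static float linear(float c) {
		return c <= 0.04045f ? c / 12.92f : (float) Math.pow((c + 0.055) / 1.055, 2.4);
	}

	static float[] rgbToLab(float r, float g, float b) {
		float rl = linear(r), gl = linear(g), bl = linear(b);
		// sRGB -> XYZ (D65), normalized by the white point
		float x = (0.4124f * rl + 0.3576f * gl + 0.1805f * bl) / 0.95047f;
		float y = (0.2126f * rl + 0.7152f * gl + 0.0722f * bl);
		float z = (0.0193f * rl + 0.1192f * gl + 0.9505f * bl) / 1.08883f;
		float fx = f(x), fy = f(y), fz = f(z);
		return new float[] { 116f * fy - 16f, 500f * (fx - fy), 200f * (fy - fz) };
	}
}
```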

I experimented with color distances a while back, since we use some kind of master palette ourselves. Right now we don’t use distances at all: we do exact RGB-to-palette matching (meaning the color distance must be == 0 every time) and output warnings/errors if no match is found. That’s by choice of our pixel artists: they produce art using our palette and want to see when they use a color that isn’t in it. But I kept the utility functions around just in case.

I tried using both RGB and HSV color spaces. In our case, results were pretty much the same. Of course, the max distance depends a lot on the palette you use.

Also, I didn’t invent these formulas, but I don’t remember the references right now.


	// note: returns the squared distance, avoiding the sqrt();
	// the per-channel differences are weighted before squaring
	public static float distanceSqrRGB(float r0, float g0, float b0, float r1, float g1, float b1) {
		float r = 0.30f * (r0 - r1);
		float g = 0.59f * (g0 - g1);
		float b = 0.11f * (b0 - b1);

		return r * r + g * g + b * b;
	}

	// note: the parameters here are HSV components; each term is half the
	// absolute per-channel difference (hue wrap-around is not handled)
	public static float distanceHSV(float h0, float s0, float v0, float h1, float s1, float v1) {
		float dHue = Math.abs(h0 - (h0 + h1) * 0.5f);
		float dSaturation = Math.abs(s0 - (s0 + s1) * 0.5f);
		float dValue = Math.abs(v0 - (v0 + v1) * 0.5f);

		return dHue * 0.4750f + dSaturation * 0.2875f + dValue * 0.2375f;
	}
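To actually use distanceSqrRGB() for palette matching, a hypothetical nearest-color search could look like this (the class name and palette layout are made up for the example; the distance function is copied from the post above so the snippet is self-contained):

```java
// Pick the palette index with the smallest weighted squared RGB distance.
class NearestColor {
	static float distanceSqrRGB(float r0, float g0, float b0, float r1, float g1, float b1) {
		float r = 0.30f * (r0 - r1);
		float g = 0.59f * (g0 - g1);
		float b = 0.11f * (b0 - b1);
		return r * r + g * g + b * b;
	}

	// palette entries are {r, g, b} floats in 0..1
	static int nearestIndex(float[][] palette, float r, float g, float b) {
		int best = 0;
		float bestDist = Float.MAX_VALUE;
		for (int i = 0; i < palette.length; i++) {
			float d = distanceSqrRGB(r, g, b, palette[i][0], palette[i][1], palette[i][2]);
			if (d < bestDist) { bestDist = d; best = i; }
		}
		return best;
	}
}
```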

Thank you! This will help a ton!

These magic numbers are fascinating! I’ve been reading Wikipedia about colour spaces for a couple of hours and still don’t quite understand what’s going on.

I thought humans were particularly sensitive to colours near to green (ancestors living in trees), so would have expected changes in the G channel of RGB to be especially noticeable, if anything. The weightings in this distance formula actually agree with that: a difference in G is weighted more than five times as heavily as the same difference in B.

In fact 0.30/0.59/0.11 look like the classic luma weights, which estimate how much each channel contributes to perceived brightness, so this metric is mostly comparing colours by their luminance. That would fit with our eyes being especially sensitive to changes in brightness on top of the mid-frequency colour sensitivity. The diagrams on Wikipedia show a huge area of green chromaticity (in LAB terms) getting squashed down into the green corner of the RGB model, which seems related, but I really don’t know! :smiley:

Re the function, my gut feeling is it might be faster if it could somehow all be done with the integer representations of the RGB, but who knows what the profiler would say. Anyway, maybe performance isn’t really an issue now if at least you’re getting accurate results?

Edit: theagentd’s gamma correction article & tutorial is interesting and relevant. It seems to have no appreciate button, so I’ll appreciate here :slight_smile:

Don’t forget you can use dithering to approximate colors (in groups of pixels).

Here I created code to approximate any RGB input with the EGA16 palette (obviously only 16 colors).

If you add animation, you can dither in the 3rd dimension (time).


I remind you, these are just 16 evenly spaced colors in the RGB color space, which means I probably use only 8-12 per frame in this specific image.

Just run this JAR for yourself:
http://indiespot.net/files/ega-dithering.jar

It should be relatively easy to use this concept to match an arbitrary palette.
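Not the code from the JAR, but the core idea of ordered dithering against an arbitrary palette can be sketched like this: bias each pixel by a position-dependent Bayer threshold, then snap to the nearest palette colour. The 'spread' knob (roughly the spacing between palette levels) is made up for the example:

```java
// Ordered (Bayer 4x4) dithering against an arbitrary packed-RGB palette.
class OrderedDither {
	static final int[][] BAYER = {
		{ 0,  8,  2, 10},
		{12,  4, 14,  6},
		{ 3, 11,  1,  9},
		{15,  7, 13,  5},
	};

	static int clamp(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

	static int nearest(int[] palette, int r, int g, int b) {
		int best = 0;
		long bestDist = Long.MAX_VALUE;
		for (int i = 0; i < palette.length; i++) {
			long dr = r - ((palette[i] >> 16) & 0xFF);
			long dg = g - ((palette[i] >> 8) & 0xFF);
			long db = b - (palette[i] & 0xFF);
			long d = dr * dr + dg * dg + db * db;
			if (d < bestDist) { bestDist = d; best = i; }
		}
		return best;
	}

	// Returns the palette index for the pixel at (x, y).
	static int ditherPixel(int[] palette, int x, int y, int rgb, int spread) {
		int bias = (BAYER[y & 3][x & 3] * spread) / 16 - spread / 2;
		int r = clamp(((rgb >> 16) & 0xFF) + bias);
		int g = clamp(((rgb >> 8) & 0xFF) + bias);
		int b = clamp((rgb & 0xFF) + bias);
		return nearest(palette, r, g, b);
	}
}
```

Cycling an offset into the Bayer matrix per frame gives the temporal (3rd-dimension) variant mentioned above.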

Whoa! That looks awesome! Definitely will look into this. Thanks!

EDIT: One thing I can’t make my mind up about is the following:

Should I make a master palette for the engine or allow each texture to have its own palette?
Keep in mind that this palette is a 256x64 table: 256 hues with 64 shades per hue.

It’s essentially a tradeoff between memory and color depth.

Why is memory such a constraint?

Just for memory locality. I’d rather have the texture mapper point to one area in memory rather than a bunch of different areas haha.
By the way, I implemented your approach (without dithering) with a 256x64 color table :). I ended up using a master palette and the results are surprisingly not that bad.

Up next is to do gamma correction :slight_smile: Thank you so much Riven!
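For the gamma step, the standard sRGB transfer functions (linearize before doing distance or averaging math, re-encode afterwards) look like this, independent of any particular engine:

```java
// Standard sRGB <-> linear-light conversion for one channel in 0..1.
class Gamma {
	static float toLinear(float c) {
		return c <= 0.04045f ? c / 12.92f : (float) Math.pow((c + 0.055) / 1.055, 2.4);
	}

	static float toSrgb(float c) {
		return c <= 0.0031308f ? c * 12.92f : (float) (1.055 * Math.pow(c, 1.0 / 2.4) - 0.055);
	}
}
```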