Color quantization reduces the number of distinct colors in an image. Why? Historically, it was a hardware necessity: computers had limited color depth, so images had to be reduced to fit. Today it is still useful for compression in palette-based image formats. GIFs are limited to 256 colors, and while PNG supports 24-bit color, a PNG can often be made much smaller in filesize with little visible degradation by applying color quantization, since PNG uses fewer bits per pixel for palettized images.
Indexed Color
Indexed color is a technique in computer graphics where each pixel in an image doesn't store full color information (like RGB). Instead, it stores a reference (an index) into a color palette, which is a list of predefined colors.
- Memory Efficiency: Reduces file size, especially useful in early computers or limited hardware.
- Palette Control: Artists can fine-tune a small set of colors for aesthetic or technical reasons (e.g., pixel art, GIFs).
- Smaller Images: If you only need 256 colors, you can use just 8 bits per pixel (2⁸ = 256), instead of 24-bit RGB (8 per channel).
Common Formats That Use Indexed Color:
- GIF: Limited to 256-color palettes.
- BMP (with 8-bit depth).
- PNG-8: A variant of PNG using a palette instead of full RGBA.
Note: Not all quantization methods use indexed color; some simply reduce the color set or bit depth directly.
Palettes
A palette is just a finite list of colors. Each color is typically stored as an RGB triplet (or sometimes in other color spaces), and the palette acts as a reference table.
Instead of storing a full color value for each pixel, you store an index into the palette. Why? To save memory! A pixel stores just the index (e.g. 8 bits for a 256-color palette), and the full colors live in a separate palette table.
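The idea fits in a few lines. A sketch (the 4-color palette here is made up for illustration): decoding an indexed image is just a table lookup per pixel.

```javascript
// A tiny palette: each entry is an [r, g, b] triple.
const palette = [
  [0, 0, 0],       // 0: black
  [255, 255, 255], // 1: white
  [255, 0, 0],     // 2: red
  [0, 0, 255],     // 3: blue
]

// Indexed pixels store one small index each (2 bits would suffice for
// 4 colors) instead of 24 bits of RGB per pixel.
const indexed = [0, 2, 2, 1, 3, 0]

// Decoding is a per-pixel palette lookup.
const decoded = indexed.map((i) => palette[i])
```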
Types of Palettes
- Fixed Palettes
Predefined and constant. Examples: CGA, NES, the Windows 16-color palette.
- Adaptive Palettes
Generated dynamically from the image using quantization algorithms (like median cut or k-means). Used for GIF encoding, image compression, etc.
- User-Defined Palettes
Set manually by an artist. Used in pixel art, sprite sheets, and demoscene visuals.
Discrete Quantization
The simplest approach: chop off the low bits of each channel, effectively bucketing every color into an evenly spaced range. This is a uniform scalar quantizer with midpoint rounding: each channel value snaps to the nearest of a small number of evenly spaced levels.
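A minimal sketch of that quantizer (function names are mine):

```javascript
// Reduce an 8-bit channel value to `bits` bits, rounding to the nearest level.
// The 0-255 range is divided into 2^bits evenly spaced levels and each
// value snaps to the closest one (midpoint rounding via Math.round).
function quantizeChannel(value, bits) {
  const levels = (1 << bits) - 1 // e.g. bits = 3 → 7 steps (8 levels)
  const step = 255 / levels      // spacing between output levels
  return Math.round(Math.round(value / step) * step)
}

function uniformQuantize(pixel, bits) {
  return pixel.map((v) => quantizeChannel(v, bits))
}

// At 1 bit per channel every value snaps to 0 or 255:
uniformQuantize([200, 100, 50], 1) // → [255, 0, 0]
```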
Simple Quantization: Fixed Palette / Euclidean Distance
Type: Fixed Palette
Given a fixed palette, we can quantize an image by finding the nearest color in the palette for each pixel. "Nearest color" is usually calculated using "straight-line distance" or "Euclidean distance". Basically... snap to color!
```javascript
function quantizeImage(imageData, palette) {
  const output = new ImageData(imageData.width, imageData.height)
  const data = imageData.data
  for (let i = 0; i < data.length; i += 4) {
    const pixel = [data[i], data[i + 1], data[i + 2]]
    // Find closest color in palette
    let minDistance = Infinity
    let closestColor = null
    for (const color of palette) {
      const dr = pixel[0] - color[0]
      const dg = pixel[1] - color[1]
      const db = pixel[2] - color[2]
      const distance = Math.sqrt(dr * dr + dg * dg + db * db)
      if (distance < minDistance) {
        minDistance = distance
        closestColor = color
      }
    }
    output.data[i] = closestColor[0]
    output.data[i + 1] = closestColor[1]
    output.data[i + 2] = closestColor[2]
    output.data[i + 3] = 255
  }
  return output
}
```
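ImageData only exists in the browser, so it can help to pull the inner search out into a pure function for experimenting. A sketch (nearestColor and the tiny palette are mine); note that comparing squared distances picks the same winner and skips the square root:

```javascript
// Pure nearest-color lookup: returns the palette entry with the smallest
// Euclidean distance to the given [r, g, b] pixel.
function nearestColor(pixel, palette) {
  let best = palette[0]
  let bestDist = Infinity
  for (const color of palette) {
    const dr = pixel[0] - color[0]
    const dg = pixel[1] - color[1]
    const db = pixel[2] - color[2]
    // Squared distance: same ordering as Euclidean distance, no sqrt needed
    const dist = dr * dr + dg * dg + db * db
    if (dist < bestDist) {
      bestDist = dist
      best = color
    }
  }
  return best
}

// Snap to a tiny black/white/red palette:
const bwr = [[0, 0, 0], [255, 255, 255], [255, 0, 0]]
nearestColor([250, 10, 10], bwr) // → [255, 0, 0]
```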
Median Cut
Type: Adaptive Palette
How it works: Recursively splits the color space by median value in the widest channel until you get k color buckets.
- Pros: Fast and simple; used in GIF and PNG8.
- Cons: Can produce harsh edges or unnatural colors.
We start by creating a ColorBox, which represents a group of pixels in RGB space. You can think of it as a 3D box surrounding a cluster of colors. We begin with one box containing all the pixels in the image, then split recursively along color-channel ranges until we have k color buckets.
```javascript
class ColorBox {
  constructor(pixels) {
    this.pixels = pixels
    this.updateRanges()
  }

  // calculate the color range for this box
  updateRanges() {
    let rMin = 255, rMax = 0
    let gMin = 255, gMax = 0
    let bMin = 255, bMax = 0
    for (const [r, g, b] of this.pixels) {
      rMin = Math.min(rMin, r)
      gMin = Math.min(gMin, g)
      bMin = Math.min(bMin, b)
      rMax = Math.max(rMax, r)
      gMax = Math.max(gMax, g)
      bMax = Math.max(bMax, b)
    }
    this.rRange = rMax - rMin
    this.gRange = gMax - gMin
    this.bRange = bMax - bMin
  }

  // determine which channel to split on - the one with the widest range
  getSplitChannel() {
    const maxRange = Math.max(this.rRange, this.gRange, this.bRange)
    if (maxRange === this.rRange) return 0 // red
    if (maxRange === this.gRange) return 1 // green
    return 2 // blue
  }

  // split the box into two new boxes at the median value in the widest channel
  split() {
    const channel = this.getSplitChannel()
    this.pixels.sort((a, b) => a[channel] - b[channel])
    const mid = Math.floor(this.pixels.length / 2)
    // Use the constructor of the current class so subclasses split into their own box type
    const BoxClass = this.constructor
    return [new BoxClass(this.pixels.slice(0, mid)), new BoxClass(this.pixels.slice(mid))]
  }

  getAverageColor() {
    const sum = [0, 0, 0]
    for (const [r, g, b] of this.pixels) {
      sum[0] += r
      sum[1] += g
      sum[2] += b
    }
    const len = this.pixels.length
    return sum.map((x) => Math.round(x / len))
  }
}
```
```javascript
function medianCutQuantization(pixels, paletteSize) {
  // start with one box containing all the pixels
  let boxes = [new ColorBox(pixels)]
  while (boxes.length < paletteSize) {
    // Split the box with the widest channel range first
    boxes.sort((a, b) => Math.max(b.rRange, b.gRange, b.bRange) - Math.max(a.rRange, a.gRange, a.bRange))
    if (boxes[0].pixels.length < 2) break // nothing left to split
    const boxToSplit = boxes.shift()
    const [box1, box2] = boxToSplit.split()
    boxes.push(box1, box2)
  }
  return boxes.map((box) => box.getAverageColor())
}
```
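The core splitting step can be sketched standalone (medianSplit is my name for it): find the channel with the widest range, sort the pixels along it, and cut at the median. Ties go to the earlier channel in R, G, B order, matching getSplitChannel above.

```javascript
// One median-cut split over pixels given as [r, g, b] triples.
function medianSplit(pixels) {
  // Range of each channel across the whole cluster
  const ranges = [0, 1, 2].map((c) => {
    const values = pixels.map((p) => p[c])
    return Math.max(...values) - Math.min(...values)
  })
  // Widest channel wins; indexOf breaks ties toward red
  const channel = ranges.indexOf(Math.max(...ranges))
  const sorted = [...pixels].sort((a, b) => a[channel] - b[channel])
  const mid = Math.floor(sorted.length / 2)
  return [sorted.slice(0, mid), sorted.slice(mid)]
}

// Red and blue both span the full range here, so the tie picks red,
// and the split separates the blue cluster from the red one:
const [lo, hi] = medianSplit([[200, 0, 0], [210, 0, 10], [0, 0, 200], [10, 0, 210]])
```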
K-Means Clustering
Type: Adaptive Palette
How it works: Clusters pixel colors into k groups using Euclidean distance, then assigns each pixel to the nearest cluster center.
- Pros: High-quality quantization; works with any number of colors.
- Cons: Slower; not always perceptually optimal.
The loop is: pick k initial centers, assign every pixel to its nearest center, recompute each center as the mean of its assigned pixels, and repeat until the centers stop moving.
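A minimal k-means sketch (the even-spaced seeding and fixed iteration count are simplifications of mine; real implementations typically use k-means++ seeding and a convergence test):

```javascript
// Naive k-means color quantizer. Pixels and centers are [r, g, b] triples.
function kMeansPalette(pixels, k, iterations = 10) {
  // Seed centers from evenly spaced pixels
  let centers = Array.from({ length: k }, (_, i) =>
    [...pixels[Math.floor((i * pixels.length) / k)]]
  )
  const sqDist = (a, b) =>
    (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2
  for (let iter = 0; iter < iterations; iter++) {
    // Assignment step: bucket each pixel with its nearest center
    const clusters = Array.from({ length: k }, () => [])
    for (const p of pixels) {
      let best = 0
      for (let c = 1; c < k; c++) {
        if (sqDist(p, centers[c]) < sqDist(p, centers[best])) best = c
      }
      clusters[best].push(p)
    }
    // Update step: move each center to the mean of its cluster
    centers = clusters.map((cluster, c) => {
      if (cluster.length === 0) return centers[c] // leave empty clusters alone
      const sum = [0, 0, 0]
      for (const [r, g, b] of cluster) {
        sum[0] += r; sum[1] += g; sum[2] += b
      }
      return sum.map((x) => Math.round(x / cluster.length))
    })
  }
  return centers
}
```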
Extra Credit: Greyscale Palette Mapping
(adds an extra step over plain palettization: greyscale bucketing)
We can create stylized images by first reducing the image to a small number of greyscale levels, then mapping those levels to a custom palette. This technique was popular in early computer art and printmaking.
```javascript
function greyscaleQuantize(imageData, levels) {
  const output = new ImageData(imageData.width, imageData.height)
  const data = imageData.data
  // Calculate the step size for each level
  const step = 255 / (levels - 1)
  for (let i = 0; i < data.length; i += 4) {
    // Convert to greyscale using the luminance formula
    const r = data[i]
    const g = data[i + 1]
    const b = data[i + 2]
    const grey = Math.round(r * 0.299 + g * 0.587 + b * 0.114)
    // Quantize to the nearest level, rounded back to an integer 0-255 value
    const level = Math.round(Math.round(grey / step) * step)
    // Map to output
    output.data[i] = level
    output.data[i + 1] = level
    output.data[i + 2] = level
    output.data[i + 3] = 255
  }
  return output
}
```
```javascript
function mapToPalette(imageData, palette) {
  const output = new ImageData(imageData.width, imageData.height)
  const data = imageData.data
  const levels = palette.length
  // Calculate the step size for mapping
  const step = 255 / (levels - 1)
  for (let i = 0; i < data.length; i += 4) {
    const grey = data[i] // All channels are the same in greyscale
    // Round (not floor) so quantized greys land in the correct bucket
    const level = Math.min(Math.round(grey / step), levels - 1)
    const color = palette[level]
    output.data[i] = color[0]
    output.data[i + 1] = color[1]
    output.data[i + 2] = color[2]
    output.data[i + 3] = 255
  }
  return output
}
```
This technique allows for creative control over the final image:
- The number of greyscale levels controls detail (more levels = more detail)
- The choice of palette sets the mood or style
- The mapping preserves the image's structure while applying the new color scheme
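Outside the browser there is no ImageData, so the level-to-color mapping can be sketched as a pure function on a single grey value (greyToPalette and the three-color palette are mine, for illustration):

```javascript
// Bucket a 0-255 grey value into one of `palette.length` levels and
// return the palette color for that level.
function greyToPalette(grey, palette) {
  const levels = palette.length
  const step = 255 / (levels - 1)
  const level = Math.min(Math.round(grey / step), levels - 1)
  return palette[level]
}

// A three-color palette: shadows, midtones, highlights
const inks = [[20, 12, 28], [196, 88, 68], [255, 236, 200]]
greyToPalette(10, inks) // → [20, 12, 28] (dark grey maps to the shadow color)
```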