Digital color is a mess
Design · Written on 28 Jan 2026

Color isn't a property of light. It's a perceptual phenomenon. Electromagnetic radiation roughly between 400 and 700 nm hits your retina, three types of cone cells (peaking near 420, 530, and 560 nm) fire signals, and your brain assembles something it calls "red" or "blue." Human perception is wildly non-linear: we're most sensitive to green, least sensitive to blue, and brightness perception follows a roughly logarithmic curve.

[Image: Electromagnetic spectrum]

From Indexed Color to RGB

Early displays used indexed color: a lookup table of predefined colors referenced by small index values. As memory got cheaper, we moved to storing full RGB values per pixel. But RGB (255, 0, 0) on one monitor didn't match the same values on another because displays have different color reproduction capabilities.
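The lookup-table scheme is simple enough to sketch in a few lines. The four-entry palette here is invented for illustration; real indexed formats typically used 16 or 256 entries:

```python
# Indexed color: each pixel stores a small palette index, not an RGB triple.
# A 4-entry palette needs only 2 bits per pixel instead of 24.
PALETTE = [
    (0, 0, 0),        # 0: black
    (255, 0, 0),      # 1: red
    (0, 255, 0),      # 2: green
    (255, 255, 255),  # 3: white
]

def decode(indices):
    """Expand index values to full RGB via the lookup table."""
    return [PALETTE[i] for i in indices]

pixels = decode([1, 1, 0, 3])  # red, red, black, white
```

The trade-off is exactly the one the move to full RGB resolved: tiny per-pixel storage, but every image is limited to the palette's handful of colors.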

The International Color Consortium (1993) addressed this with ICC profiles, metadata describing a device's color characteristics. In 1996, Microsoft and HP created sRGB, a standardized color space modeled on typical CRT phosphors. It became the web default and still is.

What Makes a Color Space

A color space needs three things: chromaticity coordinates for its primaries (plotted on the CIE xy chromaticity diagram), a white point (typically D65, about 6500 K, for digital work), and a transfer function. The primaries define your gamut boundaries. The white point sets your color temperature reference. The transfer function compensates for CRT voltage response and human brightness perception, putting more bit depth into shadows where we notice differences more. For sRGB it approximates gamma 2.2, though the actual curve is piecewise, with a short linear segment near black.
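With values normalized to [0, 1], the sRGB transfer function and its inverse can be sketched as:

```python
def srgb_encode(linear):
    """Linear light -> sRGB-encoded value, both in [0, 1].
    Piecewise: a linear segment near black, then a 2.4-exponent
    curve; the combination approximates an overall gamma of ~2.2."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """Inverse: sRGB-encoded value -> linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4
```

The linear toe exists because a pure power curve has infinite slope at zero, which amplifies noise in near-black values.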

Gamut

Display P3 covers about 25% more volume than sRGB, mostly in reds and greens. ProPhoto RGB is larger still. When wider gamut content hits a narrower display, gamut mapping kicks in. Clipping is fast but loses detail at boundaries. Compression preserves relationships but costs more. Most software just clips.
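The two strategies can be contrasted with toy implementations. The compression function below is purely illustrative (a uniform scale toward white-relative range); real gamut compression works in a perceptual space and preserves hue and lightness much more carefully:

```python
def clip_to_gamut(rgb):
    """Hard clip each channel to [0, 1]. Fast, but distinct
    out-of-gamut colors collapse onto the same boundary value,
    losing detail near the gamut edge."""
    return tuple(min(1.0, max(0.0, c)) for c in rgb)

def compress_to_gamut(rgb):
    """Toy compression: scale all channels down by the peak so
    channel ratios (and thus hue relationships) are preserved.
    Only handles over-range values, not negative ones."""
    peak = max(rgb)
    if peak <= 1.0:
        return tuple(rgb)
    return tuple(c / peak for c in rgb)
```

Note how clipping maps (1.2, 0.5, 0.2) and (1.4, 0.5, 0.2) to the same color, while compression keeps them distinct at the cost of darkening everything.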

[Image: Color Gamut]

Models vs. Spaces

RGB is a Cartesian cube. HSL/HSV use cylindrical coordinates with hue as angle, saturation as radius, lightness/value as height. These are different ways to address the same colors, not different gamuts. It matters for gradients: RGB interpolation cuts straight through the cube and often hits muddy desaturated midpoints. Polar models can walk around the hue axis instead.
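The difference shows up clearly at the midpoint of a red-to-blue gradient. A sketch using the stdlib `colorsys` module (which works in HLS, its ordering of HSL):

```python
import colorsys

def lerp(a, b, t):
    return a + (b - a) * t

def mix_rgb(c1, c2, t):
    """Straight line through the RGB cube; midpoints between
    saturated complements land on desaturated values."""
    return tuple(lerp(a, b, t) for a, b in zip(c1, c2))

def mix_hsl(c1, c2, t):
    """Walk around the hue axis (shortest arc) instead,
    interpolating lightness and saturation separately."""
    h1, l1, s1 = colorsys.rgb_to_hls(*c1)
    h2, l2, s2 = colorsys.rgb_to_hls(*c2)
    dh = ((h2 - h1 + 0.5) % 1.0) - 0.5  # shortest hue arc
    return colorsys.hls_to_rgb((h1 + dh * t) % 1.0,
                               lerp(l1, l2, t), lerp(s1, s2, t))

mid_rgb = mix_rgb((1, 0, 0), (0, 0, 1), 0.5)  # muddy (0.5, 0.0, 0.5)
mid_hsl = mix_hsl((1, 0, 0), (0, 0, 1), 0.5)  # fully saturated magenta
```

The RGB midpoint sits at half intensity, while the polar path passes through a fully saturated magenta. Which result you want depends on the gradient; neither is "correct."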

Perceptual Uniformity

CIELAB (1976) tried to make equal numerical steps produce equal perceived differences, modeling opponent color theory with L* for lightness, a* for red-green, and b* for yellow-blue. Oklab (2020) does this better, with improved hue linearity and cheaper computation. Oklch is Oklab in cylindrical form (lightness, chroma, hue) and shows up more in CSS now. You still have to clamp values to your target gamut, or you'll specify colors that don't exist on the display.
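Oklab is cheap enough to sketch directly. The matrices below are taken (from memory) from Björn Ottosson's published reference formulation, which maps linear sRGB through an LMS-like cone response; the cube root plays the role of the perceptual compression:

```python
import math

def linear_srgb_to_oklab(r, g, b):
    """Linear (not gamma-encoded) sRGB -> Oklab.
    Coefficients from Björn Ottosson's Oklab reference."""
    # Linear RGB to an LMS-like cone response
    l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
    m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
    s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
    # Cube root models the compressive nonlinearity of perception
    l_, m_, s_ = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
    return (
        0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_,
        1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_,
        0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_,
    )

def oklab_to_oklch(L, a, b):
    """Cartesian Oklab -> cylindrical Oklch (lightness, chroma, hue)."""
    return L, math.hypot(a, b), math.degrees(math.atan2(b, a)) % 360
```

White (1, 1, 1) comes out at L ≈ 1 with a and b near zero, i.e. chroma near zero in Oklch, which is a quick sanity check for any implementation.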

Bit Depth

The "16.7 million colors" number comes from 8 bits per channel (256³). At 10 bits you get over a billion. HDR needs 10+ bits. Professional work uses 16-bit for headroom. Higher bit depth prevents banding but costs storage and processing. JPEG's 8-bit ceiling is why heavy edits degrade quality: you're quantizing already-quantized data.
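The arithmetic, and the step sizes that cause banding, in a few lines:

```python
def quantize(value, bits):
    """Snap a [0, 1] value to the nearest representable level at the
    given bit depth, returning the reconstructed value."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

assert 256 ** 3 == 16_777_216        # "16.7 million colors" at 8 bits
assert 1024 ** 3 == 1_073_741_824    # over a billion at 10 bits

step_8 = 1 / 255    # ~0.004 between adjacent 8-bit levels
step_10 = 1 / 1023  # ~0.001 at 10 bits: four times finer gradients
```

An 8-bit step is about 0.4% of full range, which is visible in smooth gradients; at 10 bits each step is a quarter of that, below the threshold where most viewers see bands.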

Color Management

Your OS composites content from multiple color spaces 60+ times per second. Applications embed ICC profiles; the Color Management Module converts through a Profile Connection Space (usually CIE XYZ) to your display's profile. When it works, colors stay consistent. When it fails, usually from missing profiles or untagged content assumed to be sRGB, things shift unpredictably.
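For simple matrix-based RGB profiles, the device-to-PCS leg of that pipeline reduces to a 3-by-3 matrix applied to linearized values. A sketch using the commonly published sRGB-to-XYZ (D65) matrix, truncated to four decimals:

```python
def linear_srgb_to_xyz(r, g, b):
    """Linear sRGB -> CIE XYZ (D65), the usual Profile Connection
    Space. Inputs must already be linearized (transfer function
    undone); the CMM then applies the inverse leg for the display."""
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z
```

Feeding in white (1, 1, 1) returns the D65 white point (X ≈ 0.9505, Y = 1.0, Z ≈ 1.0890), and the Y row is exactly the luminance weights that explain why green dominates perceived brightness.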

Know your target gamut, work in appropriate bit depth, tag your profiles. The rest is plumbing.