After reading a post on the HN frontpage from amanvir.com about dithering, I decided to join in on the fun. Here’s my attempt at implementing Atkinson dithering with support for colour palettes and correct linearisation.

Dithering into arbitrary palettes

The linked post from Aman does an excellent job of explaining dithering into a black-and-white palette using Atkinson dithering. I can also recommend surma.dev’s post, which covers more than just error diffusion (for example, ordered dithering).

However, both of them convert their input images to grayscale before dithering. If the sum of the pixel’s value and the accumulated error is lighter than the threshold, they pin it to pure white, otherwise to pure black: colour = 255 if colour >= 127 else 0.
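To make that concrete, here is a minimal sketch of the monochrome case (my own naming and structure, not code from either post), assuming the input is a 2D NumPy array of 8-bit grayscale values. Atkinson dithering hands 1/8 of the quantisation error to each of six neighbours, so only 6/8 of the error is diffused in total:

```python
import numpy as np

# Atkinson neighbour offsets (dx, dy); each receives 1/8 of the error,
# so only 6/8 of the total error is diffused.
ATKINSON_OFFSETS = [(1, 0), (2, 0), (-1, 1), (0, 1), (1, 1), (0, 2)]

def atkinson_bw(img: np.ndarray) -> np.ndarray:
    """Dither a 2D uint8 grayscale image to pure black and white."""
    out = img.astype(float)
    height, width = out.shape
    for y in range(height):
        for x in range(width):
            old = out[y, x]
            new = 255.0 if old >= 127 else 0.0  # pin to white or black
            out[y, x] = new
            error = old - new
            for dx, dy in ATKINSON_OFFSETS:
                if 0 <= x + dx < width and 0 <= y + dy < height:
                    out[y + dy, x + dx] += error / 8
    return out.astype(np.uint8)
```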

But why restrict ourselves to monochromatic palettes? Instead of converting the image to grayscale before dithering, we could use any palette!

Albrecht Dürer painting dithered in RGB, CMYK and a Gameboy-like palette.

To dither into “black and white”, we simply compare the scalar value of the pixel to a threshold. If we want to work with colours, we have to account for all channels (the red, green and blue values of the pixel). Instead of a simple comparison between two scalars, we have to find the closest colour in 3D colour space.

For each distinct colour in the palette, we compute the Euclidean distance to the pixel’s colour. We also accumulate the error for each colour channel individually, similar to what is done in monochrome error diffusion dithering.

Distance in 3D colour space.
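Here is how that could look in Python, along the same lines as the sketch above; the palette is assumed to be given as N RGB triplets, and closest_palette_colour and atkinson_palette are names I made up:

```python
import numpy as np

def closest_palette_colour(pixel: np.ndarray, palette: np.ndarray) -> np.ndarray:
    """Return the palette entry with the smallest Euclidean distance to the pixel."""
    distances = np.linalg.norm(palette - pixel, axis=1)
    return palette[np.argmin(distances)]

def atkinson_palette(img: np.ndarray, palette) -> np.ndarray:
    """Dither an HxWx3 uint8 image into a palette, diffusing error per channel."""
    out = img.astype(float)
    palette = np.asarray(palette, dtype=float)
    height, width, _ = out.shape
    for y in range(height):
        for x in range(width):
            old = out[y, x].copy()
            new = closest_palette_colour(old, palette)
            out[y, x] = new
            error = old - new  # one error component per channel
            for dx, dy in [(1, 0), (2, 0), (-1, 1), (0, 1), (1, 1), (0, 2)]:
                if 0 <= x + dx < width and 0 <= y + dy < height:
                    out[y + dy, x + dx] += error / 8
    return out.astype(np.uint8)

# e.g. a Gameboy-like palette of four greens:
# atkinson_palette(img, [(15, 56, 15), (48, 98, 48), (139, 172, 15), (155, 188, 15)])
```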

If you want to play with dithering and different palettes yourself, check out ditherit.com, which has a pretty nice web interface.

Linearising

We have just committed a mortal sin of image processing. I didn’t notice it, and you might not have noticed it either, but colour-space enthusiasts will be knocking on your door shortly.

First, we failed to linearise the sRGB input image, which results in overly bright dithered output. And second, we didn’t take human perception into account: green is perceived as brighter than red, for example.

Images are usually stored in the sRGB colour space, which is tailored to CRTs. The issue arises when we want to quantitatively compare brightness in sRGB. Because it’s not a linear colour space, the difference in brightness going from 10 to 20 is not the same as from 100 to 110, for example.

Dithering a black-to-white gradient will be wrong without linearising first.

This means that dithering in sRGB directly will produce results that are too bright. Before dithering, we need to linearise, i.e. convert to a linear colour space where brightness can be compared directly. At the end, we convert back to sRGB and get a correct result. Surma explains linearisation pretty well, and you should also check out this Stack Overflow answer, which is very thorough.
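The sRGB transfer function itself is standardised (IEC 61966-2-1), so the conversion can be sketched as follows, assuming float values in [0, 1]; the function names are my own:

```python
import numpy as np

def srgb_to_linear(c):
    """sRGB in [0, 1] -> linear light (IEC 61966-2-1 transfer function)."""
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    """Linear light in [0, 1] -> sRGB."""
    return np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)

# Dither in linear space (the palette has to be linearised too),
# then convert the result back:
# linear = srgb_to_linear(img / 255.0)
# dithered = dither_in_linear_space(linear)  # hypothetical dithering step
# result = (linear_to_srgb(dithered) * 255).round().astype(np.uint8)
```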

If we also want to take human perception into account, we need to assign different weights to each channel. By scaling the colours before comparing, we preserve perceptual luminance. The linked Wikipedia post lists the following weights: Y = 0.2126 R + 0.7152 G + 0.0722 B.
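One simple way to fold these weights into the Euclidean distance from earlier looks like this; note that this is a rough approximation of perception, not a proper colour difference metric like CIEDE2000:

```python
import numpy as np

# Rec. 709 / sRGB luminance weights: green dominates perceived brightness.
LUMA_WEIGHTS = np.array([0.2126, 0.7152, 0.0722])

def weighted_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two linear RGB colours, scaled per channel."""
    return float(np.linalg.norm((a - b) * LUMA_WEIGHTS))
```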

As far as I can tell, ditherit.com doesn’t seem to linearise. If you want to play with a correct implementation, there is the dither library and the corresponding command-line utility didder from makew0rld. Check out the author’s explanation of linearisation on his blog.

Aesthetics

Dithering doesn’t only serve an aesthetic purpose, of course, and in cases where it is used for lossy compression, colour correctness is of utmost importance. But in a more artistic context, I personally think that non-linearised dithering sometimes looks better.

Gradient dithered with and without linearising.

To me, the image in the middle (not linearised) looks too bright and the right image (created with didder, colour-correct) too dark.

Comparison of my non-linearised dithering and didder’s correct code on Dürer’s “Young Hare”.

Both didder’s correct, linearised dithering and my non-linearised version look off to me. Maybe my monitor is not well calibrated or I am using didder wrong, but I just prefer the non-linearised result.

I am not implying that didder is wrong, just that I personally prefer the brighter results. If you want to play with my Python implementation, check it out on GitHub.


This has become more of a link collection than a post, but I hope someone finds it helpful to have all the resources and a basic explanation in one place… If you know more than I do about colours and noticed any errors, please reach out!