The rainbow is dead…long live the rainbow! – Perceptual palettes, part 3

Introduction

Following the first post in this series, Steve commented:

Matteo, so would I be correct in assuming that the false structures that we see in the rainbow palette are caused by inflection points in the brightness? I always assumed that the lineations we pick out are caused by our flawed color perception but it looks from your examples that they are occurring where brightness changes slope. Interesting.

As I mention in my brief reply to the reader’s comment, I’ve done some reading and more experiments to try to understand better the reasons behind the artifacts in the rainbow, and I am happy to share my conclusions. This is also a perfect lead into the rest of the series.

Human vision vs. the rainbow – issue number 1

I think there are two issues that make us see the rainbow the way we see it; they are connected, but more easily examined separately. The first is that, at a given light level, we humans perceive some colors as lighter (for example green) and others as darker (for example blue), because of differences in the fundamental color response of the human eye to red, green, and blue (the curves describing these responses are the cone sensitivity curves).

There is a well written explanation of the phenomenon on this website (and you can find here color matching functions similar to those used there to create the diagram). The difference in the sensitivity of our cones explains why, in the ROYGBIV color palette (from the second post in this series), the violet and blue appear to us darker than red, and red in turn darker than green and yellow. The principle … applies also to mixes involving the various cones (colours), hence the natural brightness of yellow which stimulates the two most reactive sets of cones in the eye. We could call this a flaw in color perception (I am not certain what the evolutionary advantage might be), and it is responsible for the erratic appearance of the lightness (L*) plot for the palette shown below (if you would like to know more about this plot, and to get the code to make it and evaluate color palettes, please read the first post in this series).
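For readers who don't want to dig up the Matlab code from the first post, below is a minimal Python sketch of the same kind of plot; the use of matplotlib and the colorspacious package, and the function name plot_lightness, are my assumptions here, not the original code.

```python
# A minimal sketch of an L* (lightness) profile plot for a colormap.
# Assumes matplotlib and the colorspacious package; the code accompanying
# the first post of the series was written in Matlab.
import numpy as np
import matplotlib.pyplot as plt
from colorspacious import cspace_convert

def plot_lightness(cmap_name="jet", n=256):
    """Plot CIELAB L* against sample number for a matplotlib colormap."""
    rgb = plt.get_cmap(cmap_name)(np.linspace(0, 1, n))[:, :3]  # drop alpha
    lab = cspace_convert(rgb, "sRGB1", "CIELab")
    L = lab[:, 0]
    fig, (ax0, ax1) = plt.subplots(2, 1, figsize=(6, 4), sharex=True)
    ax0.imshow(rgb[np.newaxis, :, :], aspect="auto")  # the palette itself
    ax0.set_yticks([])
    ax1.plot(L, "k")                                   # the lightness profile
    ax1.set_ylabel("L*")
    ax1.set_xlabel("sample number")
    plt.show()

plot_lightness("jet")
```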

So, to answer Steve: yes, I think the lineations we pick out in the rainbow are caused by inflection points in the lightness profile, but those in turn are caused by the differences in the color responses of our cones. But there's more!

Human vision vs. the rainbow – issue number 2

The second issue with human vision is that our ability to perceive CHANGES in hue is also variable, depending on the wavelength. This is illustrated by the hue discrimination curve shown in the figure below, which plots, as a function of wavelength, the smallest observable difference in hue (expressed as a difference in wavelength). The figure is from Gregory's 1964 book Eye and Brain (which, by the way, is a wonderful read, and I highly recommend it to those who are interested in human vision, together with The Vision Revolution by Mark Changizi).

There is a ‘digital era’ version of the curve by Dawson in Figure 13 of this color perception review, and some very interesting material here, but I prefer to quote directly from Gregory. According to him, the curve in the figure shows that hue discrimination is ‘… smallest – best possible hue discrimination – where the response curves have their steepest opposite (one going up, the other down) slopes … [and] … we should thus expect hue discrimination to be exceptionally good around yellow – and indeed this is so’.

That’s it! It is all in this one paragraph. For example, yellow is such a hard edge not only because it is the lightest color (issue 1), but also because it is the color in which we can see changes most easily (issue 2); when color palettes are built by interpolating linearly between hues, this makes things worse.

Finding out about this discrimination curve also gave me an insight into a possible solution to correct the rainbow. It pushed me to ask: “can I use the curve to dynamically stretch the rainbow where transitions are too sharp (most evidently around the yellow, green, and blue) compared to everywhere else?” And the answer was: yes, it can be done, with a correction function calculated directly from the discrimination curve. This is how I did it.

The top panel in the figure below is again the ROYGBIV rainbow color palette from the second post of the series. The second panel is a plot of the lightness L* corresponding to each sample in the color palette (x is sample number). In the third panel I reproduced Gregory’s wavelength discrimination curve (x is wavelength in nm). Notice that there are three major changes of gradient in the lightness profile, and that they correspond to three highs and lows in the hue discrimination curve. This observation brought the Eureka moment.

The fourth panel is my correction function, which is essentially an inverted and rescaled version of the discrimination curve. I used the function to resample the color palette at a varying, non-integer sampling rate: up to 1.5 samples/nm in the yellow area, an unmodified 1 sample/nm in the blue area, and less than 1 sample/nm everywhere else, with the total number of samples remaining 256. This resulted, for example, in a far greater number of samples around the yellow compared to the other areas. The next step was to force these new samples back to a rate of 1 sample/nm, achieving a continuously dynamic stretch and squeeze. This broadened the yellow area and eliminated the sharp edge, as can be observed in the resulting color palette in the fifth panel (notice that this palette is no longer to scale with wavelength).
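To make the procedure concrete, here is a minimal Python sketch of the stretch-and-squeeze resampling. My original code was in Matlab; this version, and the names rate and stretch_palette, are illustrative assumptions, and the digitized correction function itself is not reproduced here.

```python
# A sketch of the dynamic stretch-and-squeeze described above, assuming the
# palette index is proportional to wavelength and that `rate` is the
# correction function (inverted, rescaled discrimination curve) sampled at
# the same positions, with values roughly between 0.6 and 1.5
# (higher = more samples, i.e. local stretching).
import numpy as np

def stretch_palette(rgb, rate):
    """Resample an (N, 3) palette so regions with a high rate are stretched.

    rgb  : (N, 3) array of colors, index assumed proportional to wavelength
    rate : (N,)  array, the correction function (local sampling rate)
    """
    n = len(rgb)
    # Cumulative number of output samples spent up to each input position,
    # normalized so the output still has exactly n samples in total.
    cum = np.cumsum(rate)
    cum = (cum - cum[0]) / (cum[-1] - cum[0]) * (n - 1)
    # For each uniformly spaced output index, find the input position the
    # cumulative curve assigns to it, then interpolate the colors there.
    # Where the rate is high (yellow), a small input span is spread over
    # many output samples, broadening that area.
    x_out = np.interp(np.arange(n), cum, np.arange(n))
    stretched = np.empty_like(rgb)
    for c in range(3):
        stretched[:, c] = np.interp(x_out, np.arange(n), rgb[:, c])
    return stretched
```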

First impressions: the palette looks better in the area between the green, the yellow, and the red. The sharp edge at the yellow is gone, as mentioned, and the green area is less isoluminant. Now let’s look at the lightness L* profile for this palette, which is in the last panel. This is definitely a more perceptual profile in that area, with smoother, gentler transitions and a compressive character. To me this is a very good result in principle, even though it’s not perfect in practice. For one, much of the available L* contrast is now used up between blue and green; also, the yellow is no longer an edge, but at the cost of a loss of contrast (and from the feedback I got on a Matlab forum, this was exacerbated for viewers with color vision deficiencies).

Part of the problem might be that human wavelength discrimination curves are empirical, and there are many of them (you can find them in Wyszecki and Stiles), so no single one gave peaks and troughs in the correction function that fit perfectly with the edges in the L* profile of ROYGBIV. The function derived from Gregory’s curve is the one that gave me the best result, however. Perhaps I could reduce the amount of stretching and squeezing a bit, say constrain it to something between 0.6 and 1.2.

But part of the problem is that my idea was only ever going to address issue 2. After all my experiments I am now convinced that issue 1 with ROYGBIV is insurmountable: we certainly can’t make red lighter than yellow. And we can’t make blue a bit less dark than green. Or can we? An idea started to form in my mind at this point. What if I tried to fit a straight line, monotonically increasing from low L* values to high L* values, assigning from scratch, at each L*, a hue with that particular lightness?
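As a rough illustration of the premise (and only that; the actual palette comes in the last post of the series), here is a sketch that picks, for each target lightness along a straight L* ramp, a fully saturated hue whose own lightness is closest. The candidate hues, the color conversions, and the function name are my own assumptions.

```python
# A rough sketch of the premise: build a palette whose L* increases as a
# straight line by picking, for each target lightness, a hue whose own L*
# is closest to it. Not the palette developed later in the series.
import numpy as np
from matplotlib.colors import hsv_to_rgb
from colorspacious import cspace_convert

def lightness_matched_palette(n=256, n_hues=1024):
    # Candidate colors: fully saturated hues around the HSV circle.
    hues = np.linspace(0, 1, n_hues, endpoint=False)
    hsv = np.column_stack([hues, np.ones(n_hues), np.ones(n_hues)])
    rgb = hsv_to_rgb(hsv)
    L = cspace_convert(rgb, "sRGB1", "CIELab")[:, 0]
    # Target: a straight, monotonically increasing lightness ramp.
    target = np.linspace(L.min(), L.max(), n)
    # For each target L*, keep the candidate hue whose lightness is closest.
    idx = np.abs(L[np.newaxis, :] - target[:, np.newaxis]).argmin(axis=1)
    return rgb[idx]
```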

I will describe my efforts to produce a new, perceptual rainbow palette based on this premise in the last part of this series. But prior to that, in the next two short posts, I will discuss two really good perceptual palettes that are already available.

*** 2019 UPDATE ***

After reading Peter Kovesi’s awesome paper – Good Colour Maps: How to Design Them – and trying his similar Matlab function equalisecolormap, described here, I realized that with my method above I was really close to a solution, and managed to get it to work, this time in Python. You can read about the improved method in

🙂

Here’s a preview of how the new plots look (in this case I am using nipy_spectral as a test):

And below I show the results on an XYZ surface using jet in the top row, and nipy_spectral in the bottom row.

And, of course, if you want the code to try the equalization, get the notebook on GitHub.
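If you just want a taste of the idea without opening the notebook, below is a much simplified sketch of contrast equalization: re-interpolating a colormap so that equal index steps correspond to equal cumulative CIELAB ΔE. It is not a reimplementation of Kovesi’s function or of my notebook, and the package and function names are assumptions.

```python
# Simplified sketch of colormap contrast equalization: re-sample a colormap
# so that equal steps in index correspond to equal cumulative perceptual
# difference (CIE76 delta-E between successive samples in CIELAB).
import numpy as np
import matplotlib.pyplot as plt
from colorspacious import cspace_convert

def equalize_cmap(cmap_name="nipy_spectral", n=256):
    rgb = plt.get_cmap(cmap_name)(np.linspace(0, 1, n))[:, :3]
    lab = cspace_convert(rgb, "sRGB1", "CIELab")
    # Perceptual distance between successive samples (CIE76 delta-E),
    # with a small floor to avoid flat spots in the cumulative curve.
    de = np.sqrt((np.diff(lab, axis=0) ** 2).sum(axis=1))
    de = np.maximum(de, 1e-6)
    cum = np.concatenate([[0.0], np.cumsum(de)])
    cum /= cum[-1]                        # normalized cumulative contrast
    # Re-sample so cumulative contrast grows linearly with index.
    x = np.linspace(0, 1, n)
    x_new = np.interp(x, cum, x)
    eq = np.empty_like(rgb)
    for c in range(3):
        eq[:, c] = np.interp(x_new, x, rgb[:, c])
    return eq
```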

***************************


Related posts (MyCarta)

The rainbow is dead…long live the rainbow! – the full series

What is a colour space? reblogged from Colour Chat

Color Use Guidelines for Mapping and Visualization

A rainbow for everyone

Is Indigo really a colour of the rainbow?

Why is the hue circle circular at all?

A good divergent color palette for Matlab

Related topics (external)

Color in scientific visualization

The dangers of default disdain

Color tools

How to avoid equidistant HSV colors

Non-uniform gradient creator

Colormap tool

Color Oracle – color vision deficiency simulation – standalone (Windows, Mac and Linux)

Dichromacy – color vision deficiency simulation – open source plugin for ImageJ

Vischeck – color vision deficiency simulation – plugin for ImageJ and Photoshop (Windows and Linux)

For teachers

NASA’s teaching resources for grades 6-9: What’s the Frequency, Roy G. Biv?

References

G. Wyszecki and W. S. Stiles, Color Science: Concepts and Methods, Quantitative Data and Formulae, 2nd edition, Wiley-Interscience.