Busting bad colormaps with Python and Panel

I have not done much work with, or written here on the blog about, colormaps and perception in quite some time.

Last spring, however, I decided to build a web-based app to show the effects of using a bad colormap. This stemmed from two needs: first, to further my understanding of Panel, after working through the awesome tutorial by James Bednar, Panel: Dashboards (at PyData Austin 2019); and second, to enable people to explore interactively the effects of bad colormaps on their perception, and consequently on their ability to interpret faults on a 3D seismic horizon.

I introduced the app at the Transform 2020 virtual subsurface conference, organized by Software Underground last June. Please watch the recording of my lightning talk as it explains in detail the machinery behind it.

I am writing this post in part to discuss some changes to the app. Here’s how it looks right now:

The most notable change is the switch from one drop-down selector to two drop-down selectors, in order to support both the Matplotlib collection and the Colorcet collection of colormaps. Additionally, the app has since been featured in the resource list on the Awesome Panel site, an achievement I am really proud of.


You can try the app yourself either by running the notebook interactively with Binder, clicking on the button below:
Binder

or by copying and pasting this address into your browser:

https://mybinder.org/v2/gh/mycarta/Colormap-distorsions-Panel-app/master?urlpath=%2Fpanel%2FDemonstrate_colormap_distortions_interactive_Panel

Let’s look at a couple of examples of insights I gained from using the app. For those who jumped straight to this example, the top row shows:

  • the horizon, plotted using the benchmark grayscale colormap, on the left
  • the horizon intensity, derived using skimage.color.rgb2gray, in the middle
  • the Sobel edges detected on the intensity, on the right

and the bottom row shows:

  • the horizon, plotted using the Matplotlib gist_rainbow colormap, on the left
  • the intensity of the colormapped horizon, in the middle. This is possible thanks to a function that makes a figure (but does not display it), plots the horizon with the specified colormap, then saves the plot in the canvas to an RGB NumPy array (a sketch of this trick follows the list)
  • the Sobel edges detected on the colormapped intensity, on the right
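Here is a hedged sketch of that machinery (the function and variable names are mine, not the app’s): render the horizon to an off-screen Matplotlib figure, grab the canvas as an RGB NumPy array, then compute the intensity and Sobel edges with scikit-image.

import numpy as np
from matplotlib.figure import Figure
from matplotlib.backends.backend_agg import FigureCanvasAgg
from skimage import color, filters

def colormapped_rgb(data, cmap='gist_rainbow'):
    """Plot data with a colormap on an off-screen figure; return an RGB array."""
    fig = Figure()                      # Figure, not pyplot: nothing is displayed
    canvas = FigureCanvasAgg(fig)       # Agg backend draws in memory only
    ax = fig.add_axes([0, 0, 1, 1])
    ax.imshow(data, cmap=cmap)
    ax.axis('off')
    canvas.draw()
    return np.asarray(canvas.buffer_rgba())[..., :3]   # drop the alpha channel

# Usage, with horizon a 2D array of elevations:
# rgb = colormapped_rgb(horizon)
# intensity = color.rgb2gray(rgb)      # simulate perceived lightness
# edges = filters.sobel(intensity)     # detect edges, i.e. the faults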

I think the effects of this colormap are already apparent when comparing the bottom left plot to the top left plot. However, simulating perception can be quite revealing for those who have not considered these effects before. The intensity in the bottom middle plot is very washed out in the areas corresponding to green in the bottom left, and as a result many of the faults are no longer visible, or visible only with much difficulty, as demonstrated by the Sobel edges in the bottom right.

And if you are not quite convinced yet, I have created these hill-shaded maps, using Matt Hall’s delightful function from this notebook (and check out his blog post):

Below is another example, using the Colorcet cet_rainbow, which is one of Peter Kovesi’s perceptually uniform colormaps. I use many of Peter’s colormaps, but had never used this one, because I use my own perceptual rainbow, which has neither a fully saturated yellow nor a fully saturated red. I think the app demonstrates that, even though the artifacts are more subtle, this rainbow still introduces some. The yellow colour creates narrow flat bands, visible in the intensity and Sobel plots, and indicated by yellow arrows; the red colour is also bad as usual, causing an artificial decrease in intensity (magenta arrows).

New Horizons truecolor Pluto recolored in Viridis and Inferno

Oh, the new, perceptual Matplotlib colormaps…

Here’s one stunning, recent Truecolor image of Pluto from the New Horizons mission:


Original image: The Rich Color Variations of Pluto. Credit: NASA/JHUAPL/SwRI. Click on the image to view the full feature on the New Horizons site.

Below, I recolored the image using two of the new colormaps:

colormappedNew_Horizons_Pluto

Recolored images: I like Viridis, but it is Inferno that really brings this image to life, because of its wider hue and lightness range!
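If you would like to reproduce this kind of recolouring, a minimal sketch in Python might look like the block below (the local filename is hypothetical): collapse the truecolor image to intensity, then map the intensity through the new colormaps.

import matplotlib.pyplot as plt
from skimage import color, io

pluto = io.imread('pluto_truecolor.png')          # hypothetical local copy
intensity = color.rgb2gray(pluto)                 # 2D array of values in [0, 1]
viridis_version = plt.get_cmap('viridis')(intensity)[..., :3]
inferno_version = plt.get_cmap('inferno')(intensity)[..., :3]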

 

NASA’s beautiful ‘Planet On Fire’ images and video

Credits: NASA’s Goddard Space Flight Center and NASA Center for Climate Simulation. Australia photo courtesy of Flagstaffotos.

 

NASA_PLANET_ON_FIRE

Click on the image to watch the original video on NASA’s Visualization Explorer site.

Read the full story on NASA’s Visualization Explorer site.

Reinventing the color wheel – part 2

In the first post of this series I argued that we should not build colormaps for azimuth (or phase) data by interpolating linearly between fully saturated hues in RGB or HSL space.

A first step towards the ideal colormap for azimuth data would be to interpolate between isoluminant colours instead. Kindlmann et al. (2002) published isoluminant RGB values for red, yellow, green, cyan, blue, and magenta based on a user study. The code in the next block shows how to interpolate between those published colours to get 256-sample R, G, and B arrays (with magenta repeated at both ends), which can then be combined into an isoluminant colormap for azimuth data.

01 import numpy as np
02 r = np.array([0.718, 0.847, 0.527, 0.000, 0.000, 0.316, 0.718])
03 g = np.array([0.000, 0.057, 0.527, 0.592, 0.559, 0.316, 0.000])
04 b = np.array([0.718, 0.057, 0.000, 0.000, 0.559, 0.991, 0.718])
05 x = np.linspace(0, 255, 7)
06 xnew = np.arange(256)
07 r256 = np.interp(xnew, x, r)
08 g256 = np.interp(xnew, x, g)
09 b256 = np.interp(xnew, x, b)

This is a good example in general of how to use the interp function from the NumPy library to resample one or more sequences of values to a finer sampling. Line 01 imports NumPy. In line 05 we define 7 evenly spaced points between 0 and 255; these will be the sample coordinates for the r, g, and b colours created in lines 02-04. In line 06 we create the new coordinates at which the r, g, and b values are interpolated in lines 07-09 (all integers between 0 and 255). The full code will come in the Notebook accompanying the last post in this series.
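As a sketch of that last combining step (the array and colormap names are mine, not from the upcoming Notebook), the three interpolated channels can be stacked column-wise and passed to Matplotlib’s ListedColormap:

import matplotlib.colors as clr

# continues from the snippet above
rgb = np.column_stack((r256, g256, b256))   # 256 x 3 array, values already in [0, 1]
isolum_azimuth = clr.ListedColormap(rgb, name='isoluminant_azimuth')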

This new colormap is used in the bottom map of the figure below, whereas in the top map we used a conventional HSV azimuth colormap (both maps show the dip azimuth calculated on the Penobscot horizon). The differences are subtle, but with the isoluminant colormap we are guaranteed there are no perceptual artifacts due to the random variations in lightness of the fully saturated HSV colors.

Azimuth_compare

Another possible strategy to create a perceptual colormap for azimuth data would be to set lightness and chroma to constant values in LCH space and interpolate between hues. This is the approach behind the Matlab colormap I previously created, shown in Figure 4 of New Matlab isoluminant colormap for azimuth data. In the next post, I will show how to do this in Python.
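As a preview, here is a rough sketch of that strategy using scikit-image’s CIE LCh converters; the constant lightness and chroma values are illustrative assumptions, not those of my Matlab colormap:

import numpy as np
from skimage import color

n = 256
L = np.full(n, 65.0)                  # constant lightness (assumed value)
C = np.full(n, 38.0)                  # constant chroma (assumed value)
H = np.linspace(0, 2*np.pi, n)        # hue sweeps the full circle, wrapping around

lch = np.stack([L, C, H], axis=-1)
rgb = color.lab2rgb(color.lch2lab(lch[np.newaxis, ...]))[0]
rgb = np.clip(rgb, 0, 1)              # clip colours falling outside the RGB gamut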

Read more on colors and seismic data

The last two posts on Agile show you how to corender seismic amplitude and continuity from a time slice using a 2D colormap, and then how to corender 3 attributes from a horizon slice.

Reference

Kindlmann, G., Reinhard, E., and Creem, S. (2002). Face-based Luminance Matching for Perceptual Colormap Generation. Proceedings of the IEEE Conference on Visualization '02.

Reinventing the color wheel – part 1

In New Matlab isoluminant colormap for azimuth data I showcased a Matlab colormap that I believe is perceptually superior to the conventional, HSV-based colormaps for azimuth data, in that it does not superimpose on the data the color artifacts that plague all rainbows. However, it still has a limitation: the main colours do not correspond exactly to the four compass directions N, E, W, and S.

My intention with this series is to go back to square one, deconstruct the conventional colormaps for azimuth, and build a new one that has both of the desired properties: perceptual linearity and correct location of the main colors. All reproducible in Python.

If we wanted to build a colormap for azimuth (or phase) data from scratch, the main tasks would be to generate a sequence of colours that are distinguishable at opposite quadrants, or compass directions (like 0 and 180 degrees, or N and S), and to wrap the sequence around, with the same colour at the two ends.

But to do that, we should avoid interpolating linearly between fully saturated hues in RGB or HSL space.

To illustrate why, it is useful to look at the figure below. On the left is a hue circle with primary, secondary, and tertiary colours in a counter-clockwise sequence: red, rose, magenta, violet, blue, azure, cyan, aquamarine, electric green, chartreuse, yellow, and orange. The colour chips are placed at evenly spaced angular distances according to their hue (in radians).

hue-wheel-compare

Left, primary, secondary, and tertiary colour chips arranged using hue for angular distance; right, the same colour chips arranged using intensity for angular distance.

This looks familiar and seems like a natural ordering of colors, so, in building a colormap, we may be tempted to just take that sequence, wrap it around at the red (or the magenta), and linearly interpolate to 256 colours to get a continuous colormap [1], then use it for azimuth data. This is how the conventional azimuth colormaps are usually built.

On the right side of the figure the chips have been rearranged according to their intensity, in a counter-clockwise sequence from 0 to 255 starting at three o'clock; so, for example, blue, which is the darkest colour with an intensity of 29, is close to the beginning of the sequence, and yellow, the brightest with an intensity of 225, is close to the end. Notice that the chips are no longer equidistant.

The most striking difference is that the blue and the yellow chips are more separated than the other chips; for this reason blue and yellow features seem to stand out a lot more in a map when using this color sequence, which can be both distracting and confusing. A good example is Figure 3 in New Matlab isoluminant colormap for azimuth data.

Also, yellow and red, being two chips apart in the left circle in the figure above, are used to colour azimuths 60 degrees apart, and so are cyan and green. However, if we look at the right circle, we realize that the yellow and red chips are much further apart than the cyan and green chips [2] in the perceptual dimension of intensity; therefore, features colored in yellow and red could be perceived as much further apart (in azimuth) than those in cyan and green.

These differences may be subtle, but in my opinion they become important when dip azimuth is combined with other attributes, perhaps using a 3D colormap, and the resulting map is used for detailed structural interpretation. There is a really good example of this type of 3D colormap in Chopra and Marfurt (2007), where dip azimuth is rendered with hue modulation, dip magnitude with saturation modulation, and coherence with lightness modulation.

A code snippet with the main Python commands to generate the two polar scatterplots in the figure is listed and explained below. The full code can be found in this Jupyter Notebook.

01 import numpy as np
02 import matplotlib.pyplot as plt
03 import matplotlib.colors as clr
04 keys = ['red', '#FF007F', 'magenta', '#7F00FF', 'blue', '#0080FF', 'cyan',
   '#00FF80', '#00FF00', '#7FFF00', 'yellow', '#FF7F00']
05 my_cmap = clr.ListedColormap(keys)
06 x = np.arange(12)
07 color = my_cmap(x)
08 n = 12
09 theta = 2*np.pi*np.linspace(0, 1, n+1)[:-1]   # 12 evenly spaced angles
10 r = np.ones(n)*2.5
11 area = 200*r**2   # size of the color chips
12 ax = plt.subplot(projection='polar')
13 c = plt.scatter(theta, r, c=color, s=area)
14 theta_i = 2*np.pi*(sorted_intensity/255.0)    # sorted_intensity from the full notebook
15 colors = my_sorted_cmap(np.arange(12))        # my_sorted_cmap from the full notebook
16 c = plt.scatter(theta_i, r, c=colors, s=area)

Lines 01-03 import NumPy, Matplotlib's pyplot, and the colors module from Matplotlib. Line 04 creates the desired sequence of colours (red, rose, magenta, violet, blue, azure, cyan, aquamarine, electric green, chartreuse, yellow, and orange) using either the name or the hex code, and line 05 generates the colormap. Then we use lines 06 and 07 to assign colours to the chips in the first scatterplot (left), and lines 08, 09, and 11 to specify the number of chips, the angular distances between chips, and the area of the chips, respectively. Line 12 sets up polar axes, and line 13 generates the plot. The modifications in lines 14-16 will result in the scatterplot on the right side of the figure (sorted_intensity and my_sorted_cmap come from the full notebook; the sorted intensity is calculated in much the same way as in my Geophysical tutorial – How to evaluate and compare colormaps in Python).

 

[1] Or, perhaps, just create 12 discrete colour classes to group azimuth values in bins of pi/6 (30 degrees) each, and wrap around again at the magenta, to generate a discrete colormap.

[2] The green chip is almost completely covered by the orange chip.

The rainbow is dead…long live the rainbow! – Perceptual palettes, part 5 – CIE Lab linear L* rainbow

Some great examples

After my previous post in this series there was a great discussion on perceptual color palettes with some members of the Worldwide Geophysicists group on LinkedIn. Ian MacLeod shared some really good examples, and uploaded them here.

HSL linear L rainbow palette

Today I’d like to share a color palette that I really like:

It is one of the palettes introduced in a paper by Kindlmann et al. [1]. The authors created their palettes with a technique they call luminance controlled interpolation, which they explain in this online presentation. However, the presentation uses different palettes (their isoluminant rainbow, and their heated body), so if you find it confusing I recommend you look at the paper first. Indeed, this is a good read if you are interested in colormap generation techniques; it is one of the papers that encouraged me to develop the methodology for my cube law rainbow, which I will introduce in an upcoming post.

This is how I understand their method for creating the palette: they mapped six pure-hue rainbow colors (magenta, blue, cyan, green, yellow, and red) in HSL space, and adjusted the luminance by changing the HSL lightness value to ‘match’ that of six control points evenly spaced along the gray scale palette. After that, they interpolated linearly along the L axis between 0 and 1 using the equation presented in the paper.
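For concreteness, here is a rough sketch of my reading of that first step; it is not the authors’ code, and the Rec. 601 luma used as the matching criterion is my assumption:

import colorsys
import numpy as np

hues = [300, 240, 180, 120, 60, 0]        # magenta, blue, cyan, green, yellow, red
targets = np.arange(1, 7) / 7.0           # evenly spaced gray-scale control points

def luma(rgb):
    r, g, b = rgb
    return 0.2989*r + 0.5870*g + 0.1140*b

matched = []
for h, t in zip(hues, targets):
    # scan HSL lightness at full saturation for the best luma match
    candidates = [colorsys.hls_to_rgb(h/360.0, l, 1.0)
                  for l in np.linspace(0, 1, 1001)]
    matched.append(min(candidates, key=lambda c: abs(luma(c) - t)))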

CIE Lab linear L* rainbow palette

For this post I will try to create a similar palette. In fact, initially I was thinking of just replicating it, so I imported the palette as a screen capture image into Matlab, reduced it to a 256×3 RGB colormap matrix, and converted the RGB values to Lab to check its linearity in lightness. Below I am showing the lightness profile, colored by value of L*, and the Great Pyramid of Giza – my usual test surface – also colored by L* (notice I changed the X axis of both L* plots from sample number to Pyramid elevation to facilitate comparison of the two figures).
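The check was done in Matlab; for Python readers, a rough equivalent with scikit-image would be:

import numpy as np
from skimage import color

def lightness_profile(cmap_rgb):
    """L* values (0-100) of an N x 3 RGB colormap with values in [0, 1]."""
    return color.rgb2lab(cmap_rgb[np.newaxis, ...])[0][:, 0]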

Clearly, although the original palette was constructed to be perceptually linear, it is not linear following my import. Notice in particular the notch in the profile in the blue area, at approximately 100 m elevation. This artifact is also visible as a flat-looking blue band in the pyramid.

I have to confess I am not too sure why the palette has this peculiar lightness profile. I suspect it may be because their palette is by construction device dependent (see the paper), so that when I took the screen capture on my monitor I introduced the artifacts.

The only way to know for sure would be to use their software to create the palette, or alternatively to write the equation from the paper into Matlab code and create a palette calibrated on my monitor, then compare it to the screen-captured one. Perhaps one day I will find the time to do it, but having developed my own method to create a perceptual palette, my interest in this one became just practical: I wanted to get on with it and use it.

Fixing and testing the palette

Regardless of what the cause might be for this nonlinear L* profile, I decided to fix it, and I did so by simply replacing the original profile with a new one, changing linearly between 0.0 and 1.0. Below I am showing the L* plot for this adjusted palette, and the Great Pyramid of Giza, both again colored by value of L*.
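Again I worked in Matlab, but a minimal sketch of the fix in Python would be the following (note that scikit-image’s L* runs 0-100 rather than 0.0-1.0):

import numpy as np
from skimage import color

def linearize_lightness(cmap_rgb):
    lab = color.rgb2lab(cmap_rgb[np.newaxis, ...])[0]
    lab[:, 0] = np.linspace(0, 100, lab.shape[0])    # replace L* with a linear ramp
    return np.clip(color.lab2rgb(lab[np.newaxis, ...])[0], 0, 1)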

The pyramid with the adjusted palette seems better: the blue band is gone, and it looks great. I am ready to try it on a more complex surface. For that I have chosen the digital elevation data for South America available online through the Global Land One-km Base Elevation Project at the National Geophysical Data Center. To load and display the data in Matlab I used the first code snippet in Steve Eddins’ post on the US continental divide (modified for the South America data tiles). Below is the data mapped using the adjusted palette. I really like the result: it’s smooth and it looks right.

South_America_LinearL_solo

But how do I know, really? I mean, once I move away from my perfectly flat pyramid surface, how do I know what to expect, or not expect? In other words, how would I know if an edge I see on the map above is an artifact, or worse, that the palette is not obscuring real edges?

In some cases the answer is simple. Let’s take a look at the four versions of the map in my last figure. The first on the left was generated using the ROYGBIV palette I described in this post. It would be obvious to me, even if I never looked at the L* profile, that the blue areas are darker than the purple areas, giving the map a sort of inverted-image look.

South_America_maps_LinearL_rainbow

But how about the second map from the left? For this I used the default rainbow from a popular mapping program. It does not look too bad at first sight. Yes, the yellow is perceived as a bright, sharp edge, and we now know why that is, but other than that it would be hard to tell if there are artifacts. After a second look, though, the whole area away from the Andes is a bit too uniform.

A good way to assess these maps is to use grayscale, which we know is a good perceptual option, as a benchmark. This is the last map on the right. The third map of South America was coloured using my adjusted linear L* palette. This map looks more similar to our grayscale benchmark. Comparing the colorbars will also help: the third and fourth are very similar and both look perceptually linear, whereas the second shows flatness in the blue and green areas.

Let me know what you think of these examples. And as usual, you are welcome to use the palette in your work. You can download it here.

UPDATE

With my following post, Comparing color palettes, I introduced my new method to compare palettes with ImageJ and the 3D color inspector plugin. Below are the recorded 3D animations of the initial and adjusted palettes, respectively. In 3D it is easier to see that there is an area of flat L* between the dark purple and dark blue in the initial color palette. The adjusted color palette instead spirals upwards monotonically.

References

[1] Kindlmann, G., Reinhard, E., and Creem, S. (2002). Face-based Luminance Matching for Perceptual Colormap Generation. Proceedings of the IEEE Conference on Visualization '02.

Related posts (MyCarta)

The rainbow is dead…long live the rainbow! – the full series

What is a colour space? reblogged from Colour Chat

Color Use Guidelines for Mapping and Visualization

A rainbow for everyone

Is Indigo really a colour of the rainbow?

Why is the hue circle circular at all?

A good divergent color palette for Matlab

Related topics (external)

Color in scientific visualization

The dangers of default disdain

Color tools

How to avoid equidistant HSV colors

Non-uniform gradient creator

Colormap tool

Color Oracle – color vision deficiency simulation – stand alone (Windows, Mac and Linux)

Dichromacy – color vision deficiency simulation – open source plugin for ImageJ

Vischeck – color vision deficiency simulation – plugin for ImageJ and Photoshop (Windows and Linux)

For teachers

NASA’s teaching resources for grades 6-9: What’s the Frequency, Roy G. Biv?

ImageJ and 3D Color inspector plugin

http://rsbweb.nih.gov/ij/docs/concepts.html

http://rsb.info.nih.gov/ij/plugins/color-inspector.html

The rainbow is dead…long live the rainbow! – series outline

The rainbow is dead…long live the rainbow! – Part 1

The rainbow is dead…long live the rainbow! – Part 2: a rainbow puzzle

The rainbow is dead…long live the rainbow! – Part 3

The rainbow is dead…long live the rainbow! – Part 4 – CIE Lab heated body

The rainbow is dead…long live the rainbow! – Part 5 – CIE Lab linear L* rainbow

The rainbow is dead series – Part 6 – Comparing color palettes

The rainbow is dead series – Part 7 – Perceptual rainbow palette – the method

The rainbow is dead series – Part 7 – Perceptual rainbow palette – the goodies


ROYGBIV or YOGRVIB?

If you are interested in the topic of color palettes for scientific data, and the rainbow in particular, I would say you ought to read this 2007 IEEE visualization paper by Borland and Taylor: Rainbow Color Map (Still) Considered Harmful. It clearly and elegantly illustrates why the rainbow palette should be avoided when displaying scientific data. I like Figure 1 in the paper in particular. The illustration shows how easy it is to perceptually order a set of 4 paint chips of different gray intensity, but not at all easy to order 4 paint chips colored red, green, yellow, and blue. The authors’ argument is that the rainbow colors are certainly ordered, from shorter to longer wavelengths, but they are not perceptually ordered. In this post I wanted to extend the chips example to all 7 colors in the rainbow and try to demonstrate the point in a quantitative way.

Here below is a 256-sample rainbow palette I created by interpolating between the RGB values for the seven colors of the rainbow: red, orange, yellow, green, blue, indigo, and violet (ROY G BIV):
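Something along these lines, for example, though with Matplotlib’s named colours standing in for the exact RGB values I used:

from matplotlib.colors import LinearSegmentedColormap

roygbiv = ['red', 'orange', 'yellow', 'green', 'blue', 'indigo', 'violet']
rainbow_256 = LinearSegmentedColormap.from_list('roygbiv', roygbiv, N=256)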

On this palette I see a number of perceptual artifacts, the most notable ones being a sharp edge at the yellow and a flat zone at the green. I tried to explain the existence of these edges quantitatively in the first post of this series.

Now, to go back to the experiment: from the original RGB values for the non-interpolated colors I created the 7 color chips below. Question: can you order them based on their perceived intensity?

I think if you have full color vision (more on the topic of the rainbow and impaired color vision in the next section of this post) you will eventually be able to order them as I did. If not, try below. In this new image I converted the color chips to gray chips using the values obtained in Matlab with this formula:

INT = (0.2989 * RGB(:,1) + 0.5870* RGB(:,2) + 0.1140 * RGB(:,3))';
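For Python readers, an equivalent using NumPy, with rgb an N x 3 array of values in [0, 1], might be:

import numpy as np

rgb = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])   # e.g. a red and a blue chip
intensity = rgb @ np.array([0.2989, 0.5870, 0.1140])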

Give it a try, then hover with your mouse over the image to read the intensity values.

roygbiv_intensity
roygbiv_intensity_values

Not surprisingly, the values are not in any particular order. This reinforces the notion that although the rainbow colors are ordered by increasing (or, in this case, decreasing) wavelength, they are not perceptually ordered. (See this comment on my previous post.) Below I rearranged the gray chips by increasing intensity.

And now I reconverted the chips from gray back to RGB colors and adjusted the distance between each pair of chips so that it is proportional to the intensity difference between the chips in the pair (I actually had to artificially change the values for green and orange so they would not overlap). That was an epiphany for me. And the name is funny too: BIV R GOY, or YOG R VIB…

I said it was an epiphany because I realized the implications of trying to create a palette by interpolating through these colors at those distances. So I did it, and I am showing the result below as the top color palette. We jumped out of the frying pan, into the fire! We went from perceptual artifacts that are inherent to the rainbow (reproduced in reverse order, from blue to red, as the bottom palette to facilitate comparison) to interpolation artifacts in the intensity-ordered rainbow. Hopeless!

ROYGBIV puzzle

As if what I have shown in the previous section wasn’t scary enough, I took 7 squares and colored them using the same RGB values for Red, Orange, Yellow, Green, Blue, Indigo, and Violet. Then I used the Dichromacy plug-in in ImageJ to simulate how these colors would be seen by a viewer with Deuteranopia (the more common form of color vision deficiency). I then shuffled the squares in random order on a square canvas, and numbered them 1-7 in clockwise order.

Puzzle: can you pair the squares numbered 1 through 7 with the colors R through V? I will give away the obvious one, which is the yellow:

1=Y
2=?
3=?
4=?
5=?
6=?
7=?

Cannot do it? For the solution, just hover over the image with your mouse. If you like the animation and would like to use it on your blog, Twitter, or Facebook, get the GIF file version here. Please be kind enough to link it back to this post.

roygbiv_random_deuteranope
roygbiv_random

Conclusion

When I tried myself I could not solve the puzzle, and that finally convinced me that trying to fix the rainbow is a hopeless cause. Even if we could, it would still confuse a good number of people (about 8% of males have one form or another of color vision deficiency). From the next post on, I will show what I got when I tried to create a better, more perceptual rainbow from scratch.
