Visualizing colormap artifacts

In Evaluate and compare colormaps, I showed how to extract and display the lightness profile of a colormap using Python. I do this routinely with colormaps, but I realize it takes some effort, and not all users may feel comfortable using code to test whether a colormap is perceptual or not.

This got me thinking that there is perhaps a need for a user-friendly, interactive tool to help identify colormap artifacts, and wondering what it would look like.

In a previous post, Comparing color palettes, I plotted the elevation of the South American continent from the Global Land One-km Base Elevation Project using four different color palettes. In Figure 1 below I plot three of those again: rainbow, linear lightness rainbow, and grayscale, from left to right. In maps like these some artifacts are very evident. For example, there is a classic film-negative effect in the map on the left, where the Guiana Highlands and the Brazilian Highlands, both in blue, seem to stand lower than the Amazon basin, in violet. This is due to the much lower lightness (or, alternatively, intensity) of blue compared to violet.


Figure 1

 

However, other artifacts are more subtle, like the inversion of the highest peaks in the Andes, coloured in red, relative to their surroundings, in particular the Altiplano, an endorheic basin that includes Lake Titicaca.

My idea for this tool is simple, and consists of two windows. The first is a basemap window which can display either a demo dataset or user data loaded from an ASCII grid file. In this window the user would interactively select a profile by building a polyline with point-and-click, like the one shown in white in Figure 2.

Figure 2

The second window would show the elevation profile with the colour fill assigned based on the colormap, as in Figure 3 at the bottom (with the colormap to the right), and with a profile of the corresponding colour intensities (on a scale of 1-255) at the top.

In this view it is immediately evident that, for example, the two highest peaks near the center, coloured in red, are relative intensity lows. Another anomaly is the absolute intensity low on the right side, corresponding to the colour blue, where the elevation profile varies smoothly.

Figure 3
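
As a rough sketch of what the second window would compute, the Python snippet below takes elevation samples along a profile, maps them through a colormap, and derives the corresponding intensities. The elevation array and the choice of colormap are placeholders, not the actual prototype code.

import numpy as np
import matplotlib.pyplot as plt

# placeholder elevation profile; in the tool this would come from the user's polyline
elevation = np.concatenate([np.linspace(0, 4500, 150), np.linspace(4500, 200, 106)])

cmap = plt.get_cmap('jet')                       # colormap under scrutiny
norm = (elevation - elevation.min()) / np.ptp(elevation)
rgb = cmap(norm)[:, :3] * 255                    # RGB fill colours along the profile

# intensity (luma) of each colour, on a 0-255 scale
intensity = 0.2989 * rgb[:, 0] + 0.5870 * rgb[:, 1] + 0.1140 * rgb[:, 2]

fig, (ax_top, ax_bot) = plt.subplots(2, 1, sharex=True)
ax_top.plot(intensity)                           # intensity profile (top)
ax_bot.scatter(np.arange(elevation.size), elevation, c=norm, cmap=cmap, s=4)  # coloured elevation profile (bottom)
ax_top.set_ylabel('intensity')
ax_bot.set_ylabel('elevation (m)')
plt.show()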

I created this concept prototype using a combination of Matlab, Python, and Surfer. I welcome suggestions for possible additional features, and would like to hear from folks interested in collaborating on a web app (ideally in Python).

Moiré Patterns


Some time ago I reblogged a post from El Ojo Inoportuno showing a Moiré pattern, which resulted from taking a photo of a circular pattern of (beautiful) tiles. This phenomenon is caused by undersampling and is also called space aliasing. There's a very good explanation of space aliasing, and another stunning Moiré example, in Agile Geoscience's post N is for Nyquist.

Creating Moiré patterns

One way to get a Moiré pattern is to superimpose two identical, transparent line gratings and rotate one by an angle. You can see an animation of this on Wolfram MathWorld here; notice that the pattern varies with the angle. On the same page there's also an example of Moiré patterns generated by plotting series of curves on a computer screen, which is very similar to what happens when photographing the circular tiles in the El Ojo Inoportuno photo. Again the interference is caused by representing circles with a finite-size pixel grid. If you are interested you can experiment with these effects and many more by downloading templates from this site. Figure 1 shows my own Moiré from circular patterns.

Figure 1
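
If you would rather generate a pattern programmatically, here is a minimal Python sketch that superimposes two line gratings, one rotated by a small angle; the grating frequency and the angle are arbitrary choices.

import numpy as np
import matplotlib.pyplot as plt

n = 800
x, y = np.meshgrid(np.arange(n), np.arange(n))

freq = 0.2                                        # grating frequency (arbitrary)
angle = np.deg2rad(5)                             # rotation of the second grating

grating1 = (np.sin(freq * x) > 0).astype(float)   # vertical line grating
xr = x * np.cos(angle) + y * np.sin(angle)        # coordinate rotated by 5 degrees
grating2 = (np.sin(freq * xr) > 0).astype(float)  # same grating, rotated

plt.imshow(grating1 * grating2, cmap='gray')      # superposition shows the Moiré fringes
plt.axis('off')
plt.show()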

 

There is a program for interactive Moiré pattern experiments called iMoiré.

Another way to get a Moiré pattern is to scan a picture printed in halftone. There's a simple explanation of this scanning-generated interference here. Again this is a matter of aliasing, or undersampling. Here's a good example:

Figure 2

The original image is a lovely watercolor by Ettore Roesler Franz showing medieval houses along the Tiber river in Rome. The Moiré pattern results from scanning the watercolor from one of the book collections (the image was posted on Flickr here).

How to remove Moiré pattern from digital images

For a quick solution, there's a good article with detailed instructions on how to remove Moiré patterns in Photoshop, Paint Shop Pro, et cetera. For a more advanced workflow, there's an excellent top-hat filter for Photoshop included in Reindeer Graphics' FoveaPro plugin. In Figure 3 I created a sort of pictorial chart of this workflow using low-resolution copies of examples from The Image Processing Cookbook, by John C. Russ.

Figure 3

 

In future posts I plan to show how to remove Moiré patterns from images using open source Python code, and then to extend the workflow to the removal (or attenuation) of acquisition footprint in seismic data, which has a very similar appearance in the 2D Fourier domain and can be filtered with very similar techniques.
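
As a preview of the general idea (not the FoveaPro workflow itself), here is a minimal Python sketch that attenuates periodic interference in the 2D Fourier domain by suppressing isolated high-amplitude peaks away from the spectrum's centre; the image file name, the protected radius, and the threshold are placeholders.

import numpy as np
from scipy import ndimage
import matplotlib.pyplot as plt

img = plt.imread('scanned_halftone.png').mean(axis=-1)    # placeholder file; assumes an RGB scan

F = np.fft.fftshift(np.fft.fft2(img))
amplitude = np.abs(F)

# protect the low-frequency centre, where most of the real image energy lives
cy, cx = np.array(img.shape) // 2
yy, xx = np.ogrid[:img.shape[0], :img.shape[1]]
centre = (yy - cy) ** 2 + (xx - cx) ** 2 < 20 ** 2

# flag isolated spectral peaks (the Moiré/halftone energy) with a crude threshold
background = ndimage.median_filter(amplitude, size=9)
peaks = (amplitude > 5 * background) & ~centre

F[peaks] = background[peaks]                              # knock the peaks back down to the local background
cleaned = np.real(np.fft.ifft2(np.fft.ifftshift(F)))

plt.imshow(cleaned, cmap='gray')
plt.show()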

 

 

What your brain does with colours when you are not “looking” – part 2

In What your brain does with colours when you are not “looking”, part 1, I displayed some audio spectrogram data (courtesy of Giuliano Bernardi at the University of Leuven) using 5 different colormaps to render the amplitude values: Jet (until recently Matlab’s standard colormap), grayscale, linear lightness rainbow, modified heated body, and cube lightness rainbow. I then asked readers to cast a vote for what they thought was the best colormap to visualize this dataset.

I was curious to see how all these colormaps fared, but my expectation was that Jet would sink to the bottom. I was really surprised to see it come out on top, one vote ahead of the linear lightness rainbow (21 and 20 votes out of 62, respectively). The modified heated body followed with 11 votes.

My surprise comes from the fact that Jet carries perceptual artifacts within the progression of colours (see for example this post). One way to demonstrate these artifacts is to convert the 2D map into a 3D surface where we again use Jet to colour the amplitude values, but use the intensities from the 2D map for the elevation. This can be done, for example, using the Interactive 3D Surface Plot plugin for ImageJ (as in my previous post). The resulting surface is shown in Figure 1. This is almost exactly what your brain would do when you look at the 2D map colored with Jet in the previous post.

Figure 1
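
Outside ImageJ, the same demonstration can be sketched in a few lines of Python; the amplitude array below is a placeholder standing in for the spectrogram data.

import numpy as np
import matplotlib.pyplot as plt

# placeholder 2D amplitude map
x, y = np.meshgrid(np.linspace(-3, 3, 100), np.linspace(-3, 3, 100))
amplitude = np.exp(-(x**2 + y**2)) + 0.5 * np.exp(-((x - 1.5)**2 + (y - 1.5)**2))

cmap = plt.get_cmap('jet')
norm = (amplitude - amplitude.min()) / np.ptp(amplitude)
rgba = cmap(norm)                                 # RGBA colours of the Jet-coloured 2D map

# intensity (luma) of the coloured map, used as the surface elevation (0-1 scale)
intensity = 0.2989 * rgba[..., 0] + 0.5870 * rgba[..., 1] + 0.1140 * rgba[..., 2]

ax = plt.figure().add_subplot(projection='3d')
ax.plot_surface(x, y, intensity, facecolors=rgba, linewidth=0)  # elevation from intensity, not amplitude
plt.show()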

In Figure 2 the same data is displayed as a surface where the amplitude values themselves were used for the elevation, with a very light sun shading to help a bit with the perception of relief, but no colormap at all. When comparing Figure 1 with Figure 2, one of the artifacts is immediately recognized: the highest values in Figure 2 (which honours the data) become a relative low in Figure 1. This is because red has lower intensity than yellow, and therefore data colored in red in 2D are plotted at a lower elevation than data colored in yellow, even though the amplitudes of the latter are lower.

Figure 2

For these reasons, I did not expect Jet to be the top pick. On the other hand, I think Jet is perhaps favoured because, with consistent use, our brain learns in part to accommodate for these artifacts in 2D maps, and because it has at least two regions of higher contrast (higher-magnitude gradient) than other colormaps. Unfortunately, as I wrote in a recently published tutorial, these regions are randomly placed in the colormap, and the gradients are variable, so we gain on contrast but lose on faithfulness in representing the data structure.

Matt Hall wrote a great comment following the previous post, really making an argument for switching between multiple colormaps in the interpretation stage to explore and highlight features in both the signal and the noise in the data, and suggesting that perhaps no single colormap is best overall. I agree 100% with almost everything Matt said, except perhaps on the best overall: looking at the 2D maps, at least with this dataset, I feel the heated body could be the best overall colormap, even if marginally. In Figure 3, Figure 4, Figure 5, and Figure 6 I show the 3D displays obtained by converting the 2D grayscale, linear lightness rainbow, modified heated body, and cube lightness rainbow, respectively. Looking at the 3D displays altogether confirms that feeling for me.

What do you think?

Figure 3

Figure 4

Figure 5

Figure 6

Geophysical tutorial – How to evaluate and compare colormaps in Python

Below are two copies of a seismic horizon from the open source Penobscot 3D seismic survey, coloured using two different colormaps (data from Hall, 2014).

Figure 1

Do you think either of them is ‘better’?  If yes, can you explain why? If you are unsure and you want to learn how to answer such questions using perceptual principles and open source Python code, you can read my tutorial Evaluate and compare colormaps (Niccoli, 2014), one of the awesome Geophysical Tutorials from The Leading Edge. In the process you will learn many things, including how to calculate an RGB colormap’s intensity using a simple formula:

import numpy as np
import matplotlib.pyplot as plt

rgb = plt.get_cmap('jet')(np.linspace(0, 1, 256))[:, :3] * 255  # example only: 256 RGB samples of a colormap, scaled to 0-255
ntnst = 0.2989 * rgb[:, 0] + 0.5870 * rgb[:, 1] + 0.1140 * rgb[:, 2]  # get the intensity (luma)
intensity = np.rint(ntnst)  # round to the nearest integer

…and how to display the colormap as a colorbar with an overlay plot of the intensity, as in Figure 2.

Figure 2
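
A minimal sketch of that kind of display is below; the layout details are my own, not the tutorial's actual figure code.

import numpy as np
import matplotlib.pyplot as plt

rgb = plt.get_cmap('jet')(np.linspace(0, 1, 256))[:, :3] * 255
intensity = 0.2989 * rgb[:, 0] + 0.5870 * rgb[:, 1] + 0.1140 * rgb[:, 2]

fig, ax = plt.subplots(figsize=(8, 2))
ax.imshow(rgb[np.newaxis, :, :] / 255.0, aspect='auto', extent=(0, 256, 0, 255))  # colormap drawn as a strip
ax.plot(np.arange(256), intensity, color='k', linewidth=2)                        # intensity overlay
ax.set_ylabel('intensity')
plt.show()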

 

Reference

Hall, M. (2014) Smoothing surfaces and attributes. The Leading Edge 33, no. 2, 128–129. Open access at: https://github.com/seg/tutorials#february-2014

Niccoli, M. (2014) Evaluate and compare colormaps. The Leading Edge 33, no. 8, 910–912. Open access at: https://github.com/seg/tutorials#august-2014

New Matlab isoluminant colormap for azimuth data

I recently added to my Matlab File Exchange function, Perceptually improved colormaps, a colormap for periodic data like azimuth or phase. I am going to briefly showcase it using data from my degree thesis in geology, which I used before, for example in Visualization tips for geoscientists – Matlab. Figure 1, from that post, shows residual gravity anomalies in milligals.

Figure 1

Often we’re interested in characterizing these anomalies by calculating the direction of maximum dip at each point on the surface and displaying its azimuth, or dip azimuth. I’ve done this for the surface of residual anomalies from Figure 1 and displayed the azimuth in Figure 2. Azimuths from 0 to 360 degrees are color-coded using Jet, Matlab’s standard colormap (until recently). Typically I do not trust azimuth values where the dip is close to zero, because they are often contaminated by noise, so I would use shading to de-saturate the colors where dip has the lowest values; but for ease of discussion I haven’t done so in this case.

Figure 2. Azimuth values color-coded with Jet.
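For readers who want to reproduce this kind of map, here is a rough Python sketch of the dip-azimuth calculation. It assumes the surface Z is sampled on a regular grid with rows increasing northward and columns increasing eastward, and measures azimuth clockwise from north; your grid orientation and convention may differ, and the surface itself is a placeholder.

import numpy as np
import matplotlib.pyplot as plt

# placeholder surface on a regular grid (rows = northing, columns = easting)
east, north = np.meshgrid(np.linspace(0, 10, 200), np.linspace(0, 10, 200))
Z = np.sin(east) * np.cos(0.5 * north)

dZ_dn, dZ_de = np.gradient(Z)                        # gradients along rows (north) and columns (east)

# dip direction = direction of steepest descent; azimuth clockwise from north, in degrees
azimuth = np.degrees(np.arctan2(-dZ_de, -dZ_dn)) % 360

plt.imshow(azimuth, cmap='hsv', origin='lower')      # hsv used here only because it wraps around
plt.colorbar(label='dip azimuth (degrees)')
plt.show()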

There are two problems with Figure 2. First, the well-known issues with the Jet colormap: for example, blue is too dark and blue areas appear as bands of constant colour, while yellow is much lighter than any other colour, so we see artificial yellow edges that are not really present in the data. But there is an additional issue in Figure 2, because azimuths close in value to 0 and 360 degrees are colored blue and red, respectively, instead of a single color as they should be, causing an additional artificial edge.

In Figure 3 I recolored the map using a colormap that replicates those used in many geophysical software tools to display azimuth or phase data. This is better because it wraps around at 360 degrees but the perceptual issues are unresolved: in this case red, yellow and blue all appear as sharp perceptual edges.

Figure 3. Azimuth values color-coded with generic azimuth colormap.

 

Figure 4. Azimuth values color-coded with isoluminant azimuth colormap.

 

In Figure 4 I used my new colormap, called isoAZ (for isoluminant azimuth). This colormap is much better because not only does it wrap around at 360 degrees, but lightness is also held constant for all colors, which eliminates the perceptual anomalies. All the artificial yellow, red, and blue edges are gone; only real edges are left. This can be more easily appreciated in the figure below: if you hover over it with your mouse you can switch back and forth between Figure 3 and Figure 4.

From an interpretation point of view, azimuths 180 degrees apart are of opposing colours, which is ideal for dip azimuth data because it allows us to easily recognize folds where dips of opposite direction are juxtaposed at an edge. One example is the sharp edge in the northwest quadrant of Figure 4, where magenta is juxtaposed to green. If you look at Figure 1 you see that there’s a relative high in this area (the edge in Figure 4) with dips of opposite direction on either side (East and West, or 0 and 360 degrees).

The colormap was created in the Lightness-Chroma-Hue color space, a polar transform of the Lab color space, where lightness is the vertical axis and, at each value of lightness, chroma is the radial coordinate and hue the polar angle. One limitation of this approach is that, due to the irregular shape of the color gamut section at each lightness value, we can never exceed chroma values of about 38-40 (at lightness = 65 in Matlab; in Python, with extensive trial and error, I have not been able to go past 36 using the scikit-image color module), which makes the resulting colors pale and pastel-like.

For those who want to experiment with it further, I used just a few lines of code similar to the ones below:

radius = 38;                      % chroma
theta = linspace(0, 2*pi, 256)';  % hue angle, wrapping around the full circle
a = radius * cos(theta);
b = radius * sin(theta);
L = (ones(1, 256)*65)';           % constant lightness
Lab = [L, a, b];
RGB = colorspace('RGB<-Lab', Lab(end:-1:1,:));  % Lab to RGB conversion

This code is a modification of an example by Steve Eddins in a post on his Matlab Central blog. In Steve’s example the colormap cycles through the hues as lightness increases monotonically (which, by the way, is an excellent way to generate a perceptual rainbow). In this case lightness is kept constant while hue cycles through the entire 360 degrees and wraps around. Also, instead of using the Image Processing Toolbox, I used Colorspace, a free function from the Matlab File Exchange, for the color space transformations.
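
For Python users, a rough equivalent can be sketched with scikit-image's color module; the chroma of 36 follows the limit mentioned above, and the colormap name and sample count are my own choices.

import numpy as np
from skimage.color import lab2rgb
from matplotlib.colors import ListedColormap

n = 256
theta = np.linspace(0, 2 * np.pi, n)     # hue angle, wrapping around the full circle
chroma = 36                              # roughly the largest chroma that stays in gamut at L = 65
L = np.full(n, 65.0)                     # constant lightness -> isoluminant
a = chroma * np.cos(theta)
b = chroma * np.sin(theta)

lab = np.stack([L, a, b], axis=-1).reshape(1, n, 3)   # lab2rgb expects an image-shaped array
rgb = lab2rgb(lab)[0]

iso_az = ListedColormap(rgb, name='iso_az')           # hypothetical name, for this sketch only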

For data like fracture orientation, where azimuths 180 degrees apart are equivalent, it is better to stack two of these isoluminant colormaps in a row. In this way we place opposing colors 90 degrees apart, whereas colors 180 degrees apart are the same. You can do it using the Matlab commands repmat or vertcat, as below:

radius = 38;                      % chroma
theta = linspace(0, 2*pi, 128)';  % hue angle
a = radius * cos(theta);
b = radius * sin(theta);
L = (ones(1, 128)*65)';           % constant lightness
Lab = [L, a, b];
rgb = colorspace('RGB<-Lab', Lab(end:-1:1,:));
RGB = vertcat(rgb, rgb);          % stack two copies so the cycle repeats every 180 degrees

Colormap compromise

At the end of the series The rainbow is dead…long live the rainbow! I introduced CubicYF, a rainbow-like color palette with a slightly compressive, monotonically increasing lightness profile. Red is missing from the palette because people with deuteranopia, the most common type of color vision deficiency, confuse red and green, so I took one of the two out. An alternative solution to taking the red out is to have a yellow-like color with lower lightness followed by a red-like color of higher lightness, like in the LinearL palette described in The rainbow is dead…long live the rainbow! – Part 5. A third solution, which is a compromise in perceptual terms, is CubicL (sometimes I called it cube1), the color palette I used with the gravity data from my degree thesis in geology in the series Visualization tips for geoscientists. An example map from the series in which I used CubicL is in Figure 1 below:

Figure 1

To generate CubicL I used a compressive function for the increase in lightness, very much like the one for CubicYF, except that in this case I allowed lightness to decrease from about 90 in the yellow portion of the colormap to about 80 to get to the red. As a result, there is an inversion in the lightness trend, indicated by the arrow in the bottom row of Figure 2. However, I believe this is an acceptable compromise, and one that is pleasing for people accustomed to the rainbow (it has red), because the inversion is smooth, with a gentle downward-facing concavity. I also believe this colormap is still a perceptual improvement over the rainbow or spectrum so commonly used in seismic visualization, which typically have at least three sharp lightness inversions (indicated by arrows in the top row of Figure 2 below).

Figure 2

CubicL is one of the colormaps available in my Matlab File Exchange function Perceptually Improved Colormaps.

Parula: a new Matlab colormap

Steve Eddins of the MathWorks just published a post announcing a new Matlab colormap to replace Jet. It is called Parula (more to come on his blog about this intriguing name).

First impression: Parula looks good.

And while I haven’t had time to take it into Python to run a full perceptual test, and into ImageJ for a colour blindness test, as a preliminary check I did convert it to grayscale with an online picture-conversion tool that uses the lightness information to perform the conversion (instead of just desaturating the colors), and the result shows monotonic changes in gray.
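
The same kind of check can be done in a few lines of Python. Since Parula ships with Matlab, the sketch below uses a matplotlib colormap as a stand-in and tests whether its CIELAB lightness increases monotonically.

import numpy as np
import matplotlib.pyplot as plt
from skimage.color import rgb2lab

rgb = plt.get_cmap('viridis')(np.linspace(0, 1, 256))[:, :3]   # stand-in colormap samples
L = rgb2lab(rgb.reshape(1, -1, 3))[0, :, 0]                    # CIELAB lightness of each sample

print('monotonic lightness:', np.all(np.diff(L) >= 0))

plt.plot(L)                                                    # lightness profile
plt.ylabel('L*')
plt.show()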

Looks promising… Full test to come.

What your brain does with colours when you are not “looking” – part 1

When I published the last post of my series The rainbow is dead…long live the rainbow! there was a great discussion in the comments section with Giuliano Bernardi, a Ph.D. student at the University of Leuven, on the use of different colour palettes in audio spectrogram visualization.

Since then Giuliano has been kind enough to provide me with the data for one of his spectrograms, so I am resuming the discussion. Below is a set of five figures generated in Matlab from the same data using different colormaps. With this post I’d like to get readers involved and ask you to cast your vote for the colormap you prefer, and even drop a line in the comments section to tell us the reason for your preference.

In the second post I’ll show the data displayed with the same five colormaps but using a different type of visualization, which will reveal what our brain is doing with the colours (without our full knowledge and consent), and then I will again ask you to vote for your favourite.

 


 

A – Jet colormap

B – Gray scale

C – Linear Lightness rainbow

D – Modified heated body (linear Lightness)

E – Cube-law Lightness rainbow