Python, Einstein, and the discovery of Gravitational Waves at PyCon Italia

Introduction

At 11:53 am on September 14, 2015, Marco Drago, an Italian postdoctoral scientist at the Max Planck Institute for Gravitational Physics (also known as the Albert Einstein Institute) in Hannover, Germany, was the first person to see it: the sophisticated instruments at the Laser Interferometer Gravitational-Wave Observatory (LIGO) had detected a very likely gravitational-wave candidate signal. The signal, caused by the collision and merger of two massive black holes, proved to be real; the discovery, announced to the public a few months later, demonstrated one of Albert Einstein’s predictions exactly 100 years after he formulated his general theory of relativity. How exciting!

I was aware, through a LIGO Python thread on Reddit, that Python was used both in LIGO’s control room and for some of the scientific work and data analysis. Also, one of the main figures in the discovery paper was made entirely in Python, and used a perceptual Matplotlib colormap (Viridis, the new default in Matplotlib 2.0).

It was not, however, until I decided to attend (virtually, on YouTube) this year’s PyCon Italia that I realized how big a role Python had played. In this post, I will briefly summarize what Franco Carbognani, keynote speaker for day 2 (Python and the dawn of gravitational-wave astronomy), and Tito dal Canton (Python’s role in the detection of gravitational waves) presented on the science and technology of gravitational-wave detection, and on Python’s contributions.

The science

Most of us have learned that gravity produces a curvature in spacetime. A less widely known, but major, prediction of general relativity is that a perturbation of spacetime produces gravitational waves, which are detected because they stretch and squeeze space, producing a measurable strain; this, however, requires a violent cosmological phenomenon involving large masses, relativistic speeds (approaching the speed of light), and asymmetrical acceleration (an abrupt change in those speeds, such as in a collision).

One such event is the one that generated the signal detected in September of last year. It occurred about 1,300 million years ago, when two large black holes spiralled towards one another and then merged into a single, stationary black hole, losing energy by radiating gravitational waves (ripples in spacetime) exactly as predicted by Einstein. The whole process took only a few tenths of a second, but involved a total of 65 solar masses (36 + 29), 3 of which were converted, mostly in the final instant of the merger, to gravitational-wave energy according to the famous relationship E = mc². This was so much energy that, had it been visible (electromagnetic), it would have been 50 times brighter than the entire universe. Yet even a huge event like that created very small waves (gravity is the weakest of the four fundamental interactions of nature), which merely displaced a pair of free-falling masses placed 3 km apart by a length 10,000 times smaller than the diameter of a proton. Measuring this displacement requires very complex arrays of laser beams, mirrors, and detectors measuring interference (interferometers), such as the two LIGO detectors in the United States and VIRGO in Italy.

This video illustrates in detail how VIRGO (and LIGO) is able to detect the very weak waves; it was produced by Marco Kraan of the National Institute for Subatomic Physics in Amsterdam, and shown during Carbognani’s talk:

The figure below shows an illustration of the event’s three main phases and the corresponding, matched signals from the two LIGO interferometers.

LIGO detects gravitational waves from merging black holes. Illustration credit: LIGO, NSF, Aurore Simonnet (Sonoma State University).

Python’s many contributions

Franco Carbognani is the VIRGO integration manager with the European Gravitational Observatory in Pisa. The portion of his talk focusing on Python and VIRGO’s control systems starts here. He told the audience that Python played a major role in building a complete automation layer on top of the real-time interferometer control, with online data analysis and warnings to the operator in case of anomalies, as well as in GUI development and unification. Python was the obvious choice for these tasks because it is a compact, clear language, easy for beginners to learn, yet capable of very complex programming (functional, object-oriented, imperative, exceptions, etc.); its NumPy and SciPy libraries allow handling of the complex math required without sacrificing speed (thanks to the optimized Fortran and C under the hood); a large collection of other libraries allows almost any task to be carried out, including (as mentioned above) Matplotlib for publication-quality graphs; finally, it is open source, and many in the community (the commissioning, computing, and data analytics groups) already used it.

Tito dal Canton is a postdoctoral scientist at the Max Planck Institute in Hannover. The data analysis part of his talk starts here. The workflow he outlined, involving the use of several separate pipelines, consists of retrieving the data, deciding which data can be analyzed, deciding whether they contain a signal, estimating its statistical significance, and estimating the signal parameters (e.g. masses of the stars, spin velocity, and distance) by comparison with a model. A lot of this work is run entirely in Python using the GWpy and PyCBC packages. For the keener reader, one of the Jupyter Notebooks on the LIGO Python tutorials page replicates some of the signal processing and data analysis shown during his talk.
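To give a flavour of what this looks like in practice, here is a minimal sketch (mine, not code from the talk) that uses GWpy to fetch and condition the publicly released strain data around the event; it assumes the gwpy package is installed and an internet connection is available, and the whitening and band-pass parameters are just illustrative choices.

# Minimal sketch (not from the talk): fetch and condition open LIGO data with GWpy.
from gwpy.timeseries import TimeSeries

# ~32 s of Hanford strain data bracketing the GW150914 event (GPS time 1126259462)
strain = TimeSeries.fetch_open_data('H1', 1126259446, 1126259478)

# Whiten the data and band-pass it to the detectors' most sensitive band
conditioned = strain.whiten(4, 2).bandpass(50, 250)

# Plot the conditioned strain; the chirp should stand out near the event time
plot = conditioned.plot()
plot.show()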

Additional resources

MIT-Caltech video on Gravitational Waves detection.

Mapping and validating geophysical lineaments with Python

In Visualization tips for geoscientists: MATLAB, Part III, I showed that there is a qualitative correlation between occurrences of antimony mineralization in the southern Tuscany mining district and the distance from lineaments derived from the total horizontal derivative (also called maximum horizontal gradient).

Let’s take a look at it below (distance from lineaments increases as the color goes from blue to green, yellow, and then red).

regio_distance_mineral_occurrences

However, in a different map in the same post I showed that lineaments derived using the maxima of the hyperbolic tilt angle (Cooper and Cowan, 2006, Enhancing potential field data using filters based on the local phase) are offset systematically from those derived using the total horizontal derivative.

Let’s take a look at it below: in this case Bouguer gravity values increase as the color goes from blue to green, yellow, and then red; white polygons are basement outcrops.

The lineaments from the total horizontal derivative are in black, those from the maxima of hyperbolic tilt angle are in gray. Which lineaments should be used?

The ideal way to map the location of density contrast edges (as a proxy for geological contacts) would be to use gravity forward models, or even 3D gravity inversion, ideally constrained by all available independent data sources (magnetic or induced-polarization profiles, exploratory drilling data, reflection seismic interpretations, and so on).

The next best approach is to map edges using a number of independent gravity data enhancements, and then only use those that collocate.

Cooper and Cowan (in the same 2006 paper) demonstrate that no single edge-detector method is a perfect geologic-contact mapper. Citing Pilkington and Keating (2004, Contact mapping from gridded magnetic data – A comparison of techniques), they conclude that the best approach is to use “collocated solutions from different methods providing increased confidence in the reliability of a given contact location”.

I show an example of such a workflow in the image below. In the first column from the left is a map of the residual Bouguer gravity from a smaller area of interest in the southern Tuscany mining district (where measurements were made on a denser station grid). In the second column from the left are the lineaments extracted using three different (and independent) derivative-based data enhancements followed by skeletonization. The same lineaments are superimposed on the original data in the third column from the left. Finally, in the last column, the lineaments are combined into a single collocation map to increase confidence in the edge locations (I applied a mask so as to display edges only where at least two methods collocate).

collocation_teaser
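If you would like to experiment with the idea before opening the notebook, below is a rough, self-contained sketch of a collocation workflow in Python. It is illustrative only, not the tutorial’s code: grav stands for a hypothetical 2D NumPy array of gridded residual gravity, and the second enhancement (a Laplacian) and the percentile threshold are arbitrary placeholder choices.

# Rough sketch of the collocation idea (illustrative only, not the tutorial's code).
# 'grav' is assumed to be a 2D NumPy array of gridded residual gravity values.
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

gy, gx = np.gradient(grav)
thd = np.hypot(gx, gy)                   # total horizontal derivative
lap = np.abs(ndimage.laplace(grav))      # a second, independent enhancement (placeholder)

def edge_map(enhancement, pct=90):
    # binarize the top percentile of the enhancement and thin it to 1-pixel-wide lines
    return skeletonize(enhancement >= np.percentile(enhancement, pct))

edges_thd = edge_map(thd)
edges_lap = edge_map(lap)

# Collocation map: keep only edges where both methods agree, after a small
# dilation that tolerates slight offsets between the two sets of lineaments
collocated = edges_thd & ndimage.binary_dilation(edges_lap, iterations=2)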

If you want to learn more about this method, please read my note in the Geophysical tutorial column of The Leading Edge, which is available with open access here.
To run the open source Python code, download the iPython/Jupyter Notebook from GitHub.

With this notebook you will be able to:

1) create a full suite of derivative-based enhanced gravity maps;

2) extract and refine lineaments to map edges;

3) create a collocation map.

These techniques can easily be adapted to collocate lineaments derived from seismic data, with which the same derivative-based enhancements are showing promising results (Russell and Ribordy, 2014, New edge detection methods for seismic interpretation).

Perceptual rainbow palette – the goodies

Perceptual rainbow palette – Matlab function and ASCII files

In my last post I introduced cubeYF, my custom-made perceptual lightness rainbow palette. As promised there, I am sharing the palette with today’s post. For Matlab users, cubeYF, along with the other palettes I introduced in the series, is part of the Matlab File Exchange submission Perceptually improved colormaps.

For non-Matlab users, please download cubeYF here (RGB, 256 samples). You may also be interested in cube1, which has slightly superior visual hue contrast, thanks to the addition of a red-like color at the high-lightness end, but at the cost of a modest deviation from being 100% perceptual. I used cube1 in my Visualization tips for geoscientists series.

Perceptual rainbow palette – preformatted in various software formats

The palettes are also formatted for a number of platforms and software products: Geosoft, Hampson-Russell, SMT Kingdom, Landmark Decision Space Geoscience, Madagascar, OpendTect, Python/Matplotlib, Schlumberger Petrel, Seisware, Golden Software Surfer, Paradigm Voxelgeo. Please download them from my Color Palettes page and follow instructions therein.

Another example

In Comparing color palettes I used a map of South America [1] to compare a linear lightness palette to some common rainbow palettes, using grayscale as a perceptual benchmark. Below, I am doing the same for the cubeYF colormap.

South_America_maps_CubeYF_rainbow

Comparison of South America maps using, from left to right: ROYGBIV (from this post), classic rainbow, cubeYF, and grayscale

Again, there is little doubt in my mind that cubeYF does a superior job compared to the other two rainbow palettes, as it is free of artefacts [2] and more similar to grayscale (with the additional benefit of color).

The ROYGBIV and cubeYF maps have been included in Marek Kultys’ excellent tutorial Visual Alpha-Beta-Gamma: Rudiments of Visual Design for Data Explorers, recently published in Parsons Journal for Information Mapping, Volume V, Issue 1.

An online palette testing tool

Both cubeYF and cube1 feature in the colormap evaluation tool by the Data Analysis and Assessment Center at the Engineer Research and Development Center. If you want to quickly evaluate a number of palettes, this is the right tool. It has a collection of many palettes, organized by category, which can be used on five different test images and examined in terms of RGB components and human perception. Below is an example using cubeYF.

hpc_terrain

An idea for a palette’s mood test

A few weeks ago, thanks to Matt Hall (@kwinkunks on Twitter), I discovered Colour monitor, a great online tool by Richard Wheeler (@Zephyris on Twitter). You supply an image; Colour monitor analyses its colors in terms of hue, saturation, and luminance, and produces a graphical representation of the image’s mood [3]. I thought, what a wonderful idea!

Then I wondered: what if I used this to tell me something about a color palette’s mood? The circular histogram of colors reminded me of the harmonic templates [4] on the hue wheel from this paper. And so I created fat colorbars using the three palettes I used in the last post, saved them as images, and ran the monitor on them. Here below are the results for Matlab jet, the industry Spectrum, and cubeYF. Looking at these palettes in terms of harmony, I would say that jet is not very harmonic (it spans too large a portion of the hue circle; the T template, which is the largest, spans 180 degrees), and that the spectrum is terrible.

CubeYF also exceeds 180 degrees a bit, but looks very close to a T template rotated by 180 degrees (rotations are allowed). So perhaps I could trim it a bit? To me, though, it looks a lot nicer, gives me a vibe of really good mood, and reminds me of one of those beautiful Central American headdresses, like Moctezuma’s crown.

jet-clrmp-mood

Jet mood

Spectrum-industry-clrmp-mood

Spectrum mood

cubeYF-clrmp-mood

cubeYF mood

Notes

[1] Created with data from the Global Land One-km Base Elevation Project at the National Geophysical Data Center.

[2] Looking at the intensity of the colorbars may help in the assessment: the third and fourth colorbars are very similar and both look perceptually linear, whereas the first and second do not.

[3] Quoted from Richard’s blog post: “… in the middle is a circular histogram of the colours (spectral shades) in the image, and gives an idea of how much of each colour there is. Up the left is a histogram of image brightness (lightness of colour), and up the right is a histogram of colour saturation (vibrancy)”.

[4] Quoted from the paper’s abstract: “Harmonic colors are sets of colors that are aesthetically pleasing in terms of human visual perception.” If you are interested in this idea there is a set of slides and a video on the author’s website.

Related posts

Perceptual rainbow palette – the method

With this post I would like to introduce my new, perceptually balanced rainbow color palette. I used the palette for the first time in How to assess a colourmap, an essay I wrote for 52 Things You Should Know About Geophysics, edited by Matt Hall and Evan Bianco of Agile Geoscience.

In my essay I started with an analysis of the spectrum color palette, the default in some seismic interpretation software, using my lightness (L*) profile plot and the Great Pyramid of Giza test surface (see this post for background on the tests and to download the Matlab code). The profile and the pyramid are shown in the top left and top right images of Figure 1, from the essay.

spectrum vs cubeYF

Figure 1

In the plot the value of L* varies with the color of each sample in the spectrum, and the line is colored accordingly. This erratic profile highlights several issues with the spectrum: first, the change in lightness is not monotonic. For example, it increases from black (L*=0) to magenta [M], then drops from magenta to blue [B], then increases again, and so on. This is troublesome if the spectrum is used to map elevation, because it will interfere with the correct perception of relief, particularly if shading is added. Additionally, the curve gradient changes many times, indicating a nonuniform perceptual distance between samples. There are also plateaus of nearly flat L*, creating bands of constant color (a small one at the blue, and a large one at the green [G]).

The Great Pyramid has monotonically increasing elevation (in feet – easier to code), so there should be no discontinuities in the surface if the color palette is perceptual. Clearly, however, using the spectrum we have introduced many artificial discontinuities that are not present in the data. For the bottom row in Figure 1 I used my new color palette, which has a nice, monotonic, compressive lightness profile (bottom left). Using this palette, the pyramid surface (bottom right) is smoothly colored, without any perceptual artifact.
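If you would like to run this kind of check yourself in Python rather than Matlab, here is a minimal sketch (my adaptation, not the original Matlab code): it samples a Matplotlib colormap, converts the samples to CIELAB with scikit-image, and plots the L* profile; the colormap name is just an example.

# Minimal sketch of the L* profile check in Python (my adaptation, not the original Matlab).
import numpy as np
import matplotlib.pyplot as plt
from skimage.color import rgb2lab

cmap = plt.get_cmap('viridis')                  # any colormap you want to assess
rgb = cmap(np.linspace(0, 1, 256))[:, :3]       # 256 RGB samples, alpha channel dropped
lightness = rgb2lab(rgb.reshape(1, -1, 3)).reshape(-1, 3)[:, 0]

# A perceptual colormap should show a smooth, monotonic L* curve
plt.plot(lightness)
plt.xlabel('sample')
plt.ylabel('L*')
plt.show()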

This is how I created the palette: I started with RGB triplets for magenta, blue, cyan, green, and yellow (no red), which I converted to L*a*b* triplets using Colorspace transformations, a Matlab function available on the Matlab File Exchange. I modified the new L* values by fitting them to an approximately cube-law L* function (consistent with Stevens’ power law of perception), and adjusted the a* and b* values using Lab charts like the one in Figure 2 (from CIELab Color Space by Gernot Hoffmann, Department of Mechanical Engineering, University of Emden) to get 5 colors moving up the L* axis along an imaginary spiral (I actually used tracing paper). Then I interpolated to 256 samples using the same ~cube law, and finally converted back to RGB [1].

L*50_RGBval

Figure 2
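For Python users, the same general recipe can be sketched with NumPy and scikit-image. This is a loose adaptation of mine, not the code I actually used to build cubeYF: the circular arc below is only a crude stand-in for the hand-drawn spiral, and out-of-gamut colors are simply clipped.

# Loose Python sketch of the recipe (not the code actually used to build cubeYF).
import numpy as np
from skimage.color import lab2rgb

n = 256
L = np.linspace(1, 100 ** 0.42, n) ** (1 / 0.42)   # ~cube-law (Stevens-like) L* profile
theta = np.linspace(0.6 * np.pi, 2 * np.pi, n)     # a* and b* along an arc of a circle,
a = 50 * np.sin(theta)                              # a crude stand-in for the spiral
b = 50 * np.cos(theta)

lab = np.stack([L, a, b], axis=-1).reshape(1, n, 3)
rgb = np.clip(lab2rgb(lab).reshape(n, 3), 0, 1)     # 256 x 3 RGB colormap, clipped to gamut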

There was quite a bit of trial and error involved, but I am very happy with the results. In the animations below I compare the spectrum and the new palette, which I call cubeYF, as seen in CIELab color space. I generated these animations with the method described in this post, using the 3D color inspector plugin in ImageJ:

I also added Matlab’s default Jet rainbow – a reminder that defaults may be a necessity, but in many instances not the ideal choice:

OK, the new palette looks promising, insofar as modelling is concerned. But how would it fare with some real data? To answer this question I used a residual gravity map from my unpublished thesis in Geology at the University of Rome. I introduced this map and discussed the geological context and objectives of the geophysical study in a previous post, so please refer to that if you are curious about it. In this post I will go straight to the comparison of the color palettes; if you are unfamiliar with gravity data, try to imagine negative residuals as elevation below sea level, and positive residuals as elevation above sea level – you won’t miss out on anything.

In Figures 3 to 6 I colored the data using the above three color palettes, and grayscale as a benchmark. I generated these figures using Matlab code I shared in my post Visualization tips for geoscientists: Matlab, and I presented three of them (grayscale, Spectrum, and cubeYF) at the 2012 convention of the Canadian Society of Exploration Geophysicists in Calgary (the extended abstract, which I co-authored with Steve Lynch of 3rd Science, is available here).

In Figure 3, the benchmark for the following figures, I use grayscale to represent the data, assigning increasing intensity from most negative gravity residuals in black to most positive residuals in white (as labeled next to the colorbar). Then, I used terrain slope to create shading: the higher the slope, the darker the shading that is assigned, which results in a pseudo-3D display that is very effective (please refer to Visualization tips for geoscientists: Surfer, for an explanation of the method, and Visualization tips for geoscientists: Matlab for code).


Figure 3 – Grayscale benchmark

In Figure 4 I color the pseudo-3D surface with the cubeYF rainbow. Using this color palette instead of grayscale allows viewers to appreciate smaller changes, assess differences more quickly, or conversely identify areas of similar anomaly, while at the same time preserving the pseudo-3D effect. Now compare Figure 4 with Figure 5, where we use the spectrum to color the surface: this palette introduces several artefacts (sharp edges and bands of constant hue) which confuse the display and interfere with the perception of pseudo-relief, all but eliminating the effect. For Figure 6 I used Matlab’s default Jet color palette, which is better than the spectrum, and yet the relief effect is somewhat lost (due mainly to a sharp yellow edge and a cyan band).

campi cube YF

Figure 4 – cube YF rainbow

campi spectrum

Figure 5 – Industry spectrum

campi jet

Figure 6 – Matlab Jet

It looks like both the spectrum and jet are poor choices for color representation of a surface, with the new color palette a far superior alternative. In the CSEG convention paper mentioned above (available here), Steve and I went further by showing that the spectrum not only has these perceptual artifacts and edges, but is also very confusing for viewers with deficient color vision, a condition that occurs in about 8% of Caucasian males. We did that by using computer software [2] to simulate how viewers with two types of deficient color vision, deuteranopia and tritanopia, would see the two colored surfaces, and by comparing the results. In other words, we are now able to see the images as they would see them. Please refer to the paper for a full discussion of these simulations.
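If you would rather run this kind of simulation programmatically than through ImageJ, one option (my suggestion, not what we used for the paper) is the colorspacious Python package, which can push an sRGB image through a color-vision-deficiency model:

# Sketch of a deuteranopia simulation with colorspacious (my suggestion;
# the simulations in the paper were done with ImageJ and the Vischeck plug-in).
import numpy as np
from colorspacious import cspace_convert

def simulate_deuteranopia(rgb_image):
    # rgb_image: float array in [0, 1], shape (rows, cols, 3)
    cvd_space = {"name": "sRGB1+CVD", "cvd_type": "deuteranomaly", "severity": 100}
    return np.clip(cspace_convert(rgb_image, cvd_space, "sRGB1"), 0, 1)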

Here, I show in Figures 7 to 9 the deuteranope simulations for cubeYF, spectrum, and jet, respectively. In all three simulations the hue discrimination has decreased, but while the spectrum and jet are now even more confusing, cubeYF has preserved the relief effect.

Deuteranope Simulation of campi cube YF

Deuteranope Simulation of cube YF

Deuteranope Simulation of campi spectrum

Deuteranope Simulation of Industry spectrum

Deuteranope Simulation of campi jet

Deuteranope Simulation of Matlab Jet

That’s it for today. In my next post, to be published very shortly, you will get the palette, and a lot more.

References

A more perceptual color palette for structure maps, CSEG/CSPG 2012 convention, Calgary

How to assess a colourmap, in 52 Things You Should Know About Geophysics

Notes

[1] An alternative to the method I used would be to start directly in CIELab color space, and generate some kind of spiral L* lightness profile programmatically. For example:

– Using 3D helical curves from: http://www.mathworks.com/matlabcentral/fileexchange/25177-3d-curves

– Using Archimedes spiral

– Expanding on code by Steve Eddins at Mathworks (A path through L*a*b* color space) in this article, one could create a spiral cube-law lightness palette with something like:

%% this creates best-fit pure power law function 
%  Inspired by wikipedia - http://en.wikipedia.org/wiki/Lightness
l2=linspace(1,power(100,0.42),256); 
L2=(power(l2,1/0.42))'; 

%% this makes cielab real cube function spiral 
radius = 50; 
theta = linspace(0.6*pi, 2*pi, 256).'; 
a = radius * sin(theta); b = radius * cos(theta); 
Lab1 = [L2, a, b]; RGB_realcube=colorspace('RGB<-Lab',(Lab1));

[2] The simulations were created using ImageJ, an open source image manipulation program, and the Vischeck plug-in. I later discovered Dichromacy, another ImageJ plug-in for these simulations, which has the advantage of being open source. The simulations can also be performed on the fly (no upload needed) using the online tool Color Oracle.

Related posts

Visualization tips for geoscientists: Matlab, part III

 Introduction

Last weekend I had a few hours to play with but needed a short break from writing about color palettes, so I decided to go back and finish up (for now) this series on geoscience visualization in Matlab. In the first post of the series I expanded on work by Steve Eddins at Mathworks on overlaying images using influence maps and demonstrated how it could be used to enhance the display of a single geophysical dataset.

Using transparency to display multiple data sets – an example

At the end of the second post I promised I would go back and show an example of using transparency and influence maps for other tasks, like overlaying different attributes. My favorite example is in Figure 1. The image is a map, in pastel colors, of the Bouguer gravity anomaly for the Southern Tuscany region of Italy, with three other layers superimposed using the techniques mentioned above.

It is beyond the objectives of this post to discuss gravity exploration methods at length, or to attempt a full interpretation of the map. I will go back to it at a later time, as I am planning a full series on gravity exploration using this data set, but if you are burning to read more about gravity interpretation please check these excellent notes by Martin Unsworth, Professor of Physics in the Earth and Atmospheric Sciences department, University of Alberta, and note 4 at the end of this post. Otherwise, and for now, suffice it to say that warm colors (green to yellow to red) in the Bouguer gravity map indicate, relatively speaking, excess mass in the subsurface, and blue and purple indicate a deficit of mass in the subsurface.

The black and grey lines are lineaments extracted from derivatives of the Bouguer gravity data using two different methods [1]. The semitransparent, white-filled polygons show the location of some of the basement outcrops (the densest rocks in this area).

Lineaments extracted from gravity data can correspond to contacts between geological bodies of different density, so a correlation can be expected between basement outcrops and some of the lineaments, as the outcrops are often in lateral contact with much less dense rocks. This is often exploited in mineral exploration in areas such as this, where mineralization occurs at or in the vicinity of these contacts. As an example, I show in Figure 2 the occurrences (AGIP – RIMIN, unpublished industry report, 1989) of silicization (circles) and antimony deposits (triangles), superimposed on the distance from one of the sets of lineaments (warm colors indicate greater distance) from Figure 1.

The fact that different methods give systematically shifted results is well known, and is often due to the trade-off between resolution and stability, whereby the more stable methods are less affected by noise but often produce smoother edges over deeper contacts, and their maxima may not correspond. This is in addition to the inherent ambiguity of gravity data, which cannot, by themselves, be interpreted uniquely. To establish which method might be more correct in this case (none is a silver bullet) I tried to calibrate the results using basement outcrops (i.e. does either method more closely match the outcrop edges?). Having done that, I would have more confidence in making inferences about possible other contacts in the subsurface suggested by the lineaments. I would say the black lines do a better overall job in the East, the gray perhaps in the West. So perhaps I’m stuck? I will get back to this during my gravity series.

Figure 1

regio_distance_mineral_occurrences

Figure 2

Matlab code

As usual I am happy to share the code I used to make the combined map of Figure 1. Since the data I use is in part from my unpublished thesis in Geology and in part from Michele di Filippo at the University of Rome, I am not able to share it, and you will have to use your own data, but the Matlab code is easily adapted. The code snippet below assumes you have a geophysical surface already imported in the workspace and stored in a variable called “dataI”, as well as the outcrops in a variable called “basement”, and the lineaments in “lnmnt1” and “lnmnt2”. It also uses my cube1 color palette.

 
% part 1 - map gravity data
figure; imagesc(XI,YI,dataI); colormap(cube1); hold on;
%
% part 2 - dealing with basement overlay
white=cat(3, ones(size(basement)), ones(size(basement)),...
 ones(size(basement)));
ttt=imagesc(Xb,Yb,white); % plots white layer for basement
%
% part 3 - dealing with lineaments overlays
black=cat(3, zeros(size(lnmnt1)), zeros(size(lnmnt1)),...
 zeros(size(lnmnt1)));
grey=black+0.4;
basement_msk=basement.*0.6;
kkk=imagesc(XI,YI,black); % plots black layer for lineament 1
sss=imagesc(XI,YI,grey); % plots grey layer for lineament 2
hold off
%
% part 4 - set influence maps
set(ttt, 'AlphaData', basement_msk); % influence map for basement
set(kkk, 'AlphaData', lnmnt1); % influence map for lineament 1
set(sss, 'AlphaData', lnmnt2); % influence map for lineament 2
%
% making it pretty
axis equal
axis tight
axis off
set(gca,'YDir','normal');
set(gcf,'Position',[180 150 950 708]);
set(gcf,'OuterPosition',[176 146 958 790]);

Matlab code, explained

OK, let’s break it down starting from scratch. I want first to create a figure and display the gravity data, then hold it so I can overlay the other layers on top of it. I do this with these two commands:

figure;imagesc(XI,YI,dataI);

hold on;

The layer I want to overlay first is the one showing the basement outcrops. I make a white basement layer covering the full extent of the map, which is shown in Figure 3, below.

Figure 3

I create it and plot it with the commands:

white=cat(3, ones(size(basement)), ones(size(basement)), ones(size(basement)));

ttt=imagesc(Xb,Yb,white);

The handle ttt is to be used in combination with the basement influence map to produce the partly transparent basement overlay: remember that I wanted to display the outcrops in white, but only partially opaque, so the colored gravity map can still be (slightly) seen underneath. I make the influence map, shown in Figure 4, with the command:

basement_msk=basement.*0.6;

Since the original binary variable “basement” had values of 1 for the outcrops and 0 elsewhere, with the command above I assign an opacity of 0.6 to the outcrops, which will be applied when the next command, below, is run, achieving the desired result.

set(ttt, 'AlphaData', basement_msk); % uses basement influence map

Figure 4

For the lineaments I do things in a similar way, except that I want those plotted with full opacity since they are only 1 pixel wide.

As an example I am showing in Figure 5 the black layer for lineament 1, and in Figure 6 its influence map, which has values of 1 (full opacity) for the lineament and 0 (full transparency) everywhere else.

Figure 5

Figure 6

Now a few extra lines to make things pretty, and this is what I get, shown below in Figure 7: not what I expected!

Figure 7

The problem is in these two commands:

white=cat(3, ones(size(basement)), ones(size(basement)), ones(size(basement)));

ttt=imagesc(Xb,Yb,white);

I am calling the layer white, but really all I am telling Matlab is to create a layer with maximum intensity (1). The preceding colormap(cube1) command, however, assigned a salmon-red color to the maximum intensity in the figure, and so that is what you get for the basement overlay.

Again, to get the result I wanted, I had to come up with a trick like in the second post examples. This is the trick:

I create a new color palette with this command:

cube1edit=cube1; cube1edit(256,:)=1;  

The new color palette has the last RGB triplet defined as white, not salmon-red.

Then I replace this line:

figure; imagesc(XI,YI,dataI); colormap(cube1); hold on;

with the new line:

figure; imagesc(XI,YI,dataI, [15 45]); colormap (cube1edit); hold on;

The highest value in dataI is around 43. By stretching the color range from [15 43] to [15 45], and therefore exceeding max(dataI), I ensure that white is used for the basement overlay but not in any part of the map where gravity is highest and there is no basement outcrop. In other words, white is assigned in the palette but reserved for the overlay.

Please let me know if that was clear. If it isn’t I will try to describe it better.

Notes

[1] One method is the total horizontal derivative. The other method is the hyperbolic tilt angle, computed using Matlab code by Cooper and Cowan (see Reference below). This is how I produced the two overlays: first I calculated the total horizontal derivative and the tilt angle, then I found the maxima to use as the overlay layers. This is similar to Figure 3e in Cooper and Cowan, but I refined my maxima result by reducing them to 1-pixel-wide lines (using a thinning algorithm).

Reference

Cooper, G.R.J., and Cowan, D.R. (2006). Enhancing potential field data using filters based on the local phase. Computers & Geosciences, 32, 1585–1591.

Related posts (MyCarta)

Visualization tips for geoscientists: Surfer

Visualization tips for geoscientists: Matlab

Visualization tips for geoscientists: Matlab, part II

Image Processing Tips for Geoscientists – part 1

The rainbow is dead…long live the rainbow! – Perceptual palettes, part 4 – CIE Lab heated body

In my last post I discussed the two main issues with the rainbow color palette from the point of view of human color vision, and concluded that one of these issues is insurmountable.

But before I move on to presenting alternative color palettes, let me give you one last example of how bad the rainbow is. It was sent to me by Antony Price, a member of the LinkedIn group Worldwide Geophysicists. Antony created a grayscale and a rainbow-colored version – using the same data range and number of intervals – of the satellite-altimeter-derived free-air gravity map of the world [1]. I am showing the two maps below.

Continue reading

Visualization tips for geoscientists: Matlab, part II

Introduction

In my previous post on this topic I left two loose ends: one in the main text, about shading in 3D, and one in the comment section, to follow up on a couple of points in Evan’s feedback. I finally managed to go back and spend some time on those, and that is what I am posting about today.

Part 1 – apply shading with transparency in 3D with the surf command

I had been trying to write some code to apply the shading with transparency using the surf command. In fact, I had been trying, and asking around in the Matlab community, for more than one year, but to no avail. I think it is not possible to create the shading directly that way. I did, however, find a workaround. The breakthrough came when I asked myself this question: can I find a way to capture in a variable the color and the shading associated with each pixel in one of the final 2D maps from the previous post? If I could do that, then it would be possible to assign the colors and shading from that variable using this syntax for the surf command:

surf(data,c);

where data is the gravity matrix and c is the color and shading matrix. To do it in practice I started from a suggestion by Walter Roberson on the Matlab community, in his answer to my question on this topic.

The full code to do that is below, followed by an explanation including three figures. As with the other post, since the data set I use is from my unpublished thesis in Geology, I am not able to share it, and you will have to use your own data, but the Matlab code is easily adapted.

%% cell 1
figure;
shadedpcolor(x,y,data,(1-normalise(slope)),[-5.9834 2.9969],[0 1],0.45,cube1,0);
axis equal; axis off; axis tight
shadedcolorbar([-5.9834 2.9969],0.55,cube1);

In cell 1, using again shadedpcolor.m, normalise.m, and the cube1 color palette, I create the 2D shaded image, which I show here in Figure 1.

Figure 1
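As an aside, in Python/Matplotlib the equivalent trick is more direct, because plot_surface accepts a precomputed array of face colors. The sketch below is my analogue, not the Matlab workaround described in this post: it uses Matplotlib’s built-in hill-shading, assumes the gridded surface is in a 2D array called data, and uses viridis simply because cube1 is not a Matplotlib colormap.

# Python/Matplotlib analogue: drape precomputed colors and shading on a 3D surface
# (my sketch, not the Matlab workaround described in this post).
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LightSource
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (needed only on older Matplotlib)

ny, nx = data.shape                      # 'data' is assumed to be a 2D gridded surface
X, Y = np.meshgrid(np.arange(nx), np.arange(ny))

ls = LightSource(azdeg=315, altdeg=45)
rgb = ls.shade(data, cmap=plt.cm.viridis, vert_exag=1, blend_mode='soft')

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.plot_surface(X, Y, data, facecolors=rgb, rstride=1, cstride=1,
                linewidth=0, antialiased=False, shade=False)
plt.show()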


Continue reading

Visualization tips for geoscientists – Matlab

Introduction

In my last post I described how to create a powerful, nondirectional shading for a geophysical surface using the slope of the data to assign the shading intensity (i.e. areas of greater slope are assigned darker shading). Today I will show how to create a similar effect in Matlab.

Since the data set I use is from my unpublished thesis in Geology, I am not able to share it, and you will have to use your own data, but the Matlab code is easily adapted. The code snippets below assume you have a geophysical surface already imported in the workspace and stored in a variable called “data”, as well as its derivative in a variable called “data_slope”.

Method 1 – with a slope mask and transparency

Some time ago I read this interesting Image Processing blog post by Steve Eddins at Mathworks on overlaying images using transparency. I encourage readers to take a look at this and other posts by Steve – he’s great! That particular blog post gave me the idea to use transparency and the slope to create my favorite shading in Matlab.

In addition to the code below you will need normalise.m from Peter Kovesi‘s website, and to import the color palette cube1.

%% alpha transparency code snippet
black = cat(3, zeros(size(data)), zeros(size(data)), ...
    zeros(size(data)));             % make a truecolor all-black image
gray=black+0.2;                     % make a truecolor all-gray image
alphaI=normalise(data_slope);       % create transparency weight matrix
                                    % using data_slope

imagesc(data);colormap(cube1);      % display data
hold on
h = imagesc(gray);                  % overlay gray image on data
hold off
set(h, 'AlphaData', alphaI);        % set transparency of gray layer using
axis equal;                         % weight matrix
axis tight;
axis off;

And here is the result in Figure 1 below – not bad!

Figure 1. Shaded using transparency
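For readers who prefer Python, here is a rough Matplotlib translation of the snippet above (mine, and only a sketch): data and data_slope are assumed to be 2D NumPy arrays, the normalisation is done inline since normalise.m is a Matlab function, and viridis stands in for cube1.

# Rough Python/Matplotlib translation of the alpha-transparency shading (a sketch only).
# 'data' and 'data_slope' are assumed to be 2D NumPy arrays, as in the Matlab snippet.
import numpy as np
import matplotlib.pyplot as plt

alpha = (data_slope - data_slope.min()) / np.ptp(data_slope)   # normalise slope to [0, 1]

gray = np.zeros(data.shape + (4,))      # truecolor all-gray RGBA image...
gray[..., :3] = 0.2
gray[..., 3] = alpha                    # ...whose alpha channel is the normalised slope

fig, ax = plt.subplots()
ax.imshow(data, cmap='viridis')         # display the data (viridis stands in for cube1)
ax.imshow(gray)                         # overlay the gray layer, weighted by the slope
ax.set_axis_off()
ax.set_aspect('equal')
plt.show()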

Continue reading