I got this idea of making a modern (no, I am NOT kidding) educational tool to interactively construct and study Ricker wavelets after reading William Ashcroft’s A Petroleum Geologist’s Guide to Seismic Reflection. The book is a gem, in my opinion (I discovered it thanks to a blog post by Matt Hall a couple of years ago) and comes with some interesting tools on a DVD, which are unfortunately a bit outdated. So I went back to my old Geophysics Bible (Yilmaz’s book Seismic Data Analysis), and mashed a few ideas up; you can download the Excel file here. Please feel free to use it, peruse it, and abuse it, in your classes, labs, nightmares, etcetera…
The tool is fairly simple. In Sheet 1 the user enters the dominant frequency of the desired Ricker wavelet, as shown in the middle of Figure 1. From that information the wavelet's amplitude spectrum is constructed using the equation A = g^2 * exp(-g^2), where g is the ratio between frequency f (in increments of 5 Hz up to an arbitrary 125 Hz – but this could be easily changed!) and the dominant frequency f1 just entered. The frequency spectrum of the wavelet is shown as a graph.
Figure 1
In Figure 2 I show the same Sheet, but for a wavelet of dominant frequency equal to 50 Hz.
Figure 2
A number of ancillary graphs, shown in Figure 3, display the individual building blocks of the formula A = g^2 * exp(-g^2).
Figure 3
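For readers who prefer scripting to spreadsheets, below is a minimal Python sketch of Sheet 1's calculation (I use the 50 Hz dominant frequency of Figure 2 as an example; the variable names are mine, not the spreadsheet's):
import numpy as np
import matplotlib.pyplot as plt
f1 = 50.0                          # dominant frequency entered by the user, in Hz
f = np.arange(5.0, 130.0, 5.0)     # frequencies in 5 Hz increments up to 125 Hz
g = f / f1                         # ratio of each frequency to the dominant frequency
A = g**2 * np.exp(-g**2)           # amplitude spectrum, A = g^2 * exp(-g^2)
plt.plot(f, A, '-o')
plt.xlabel('Frequency (Hz)')
plt.ylabel('Amplitude')
plt.show()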
In Sheet 2 the user can view a plot of the wavelet in the time domain, add constant phase shifts and linear phase shifts, and experiment with different combinations of them, as shown in Figures 4-7 below.
Figure 4
Figure 5
Figure 6
Figure 7
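I do not know exactly how the spreadsheet assembles the wavelet internally, but one common construction consistent with the figures above is to sum cosines weighted by the amplitude spectrum, with a phase made of a constant term plus a term linear in frequency. A rough Python sketch follows; the phase values are arbitrary examples and the variable names are mine:
import numpy as np
f1 = 50.0                                   # dominant frequency, Hz
f = np.arange(5.0, 130.0, 5.0)              # frequency components, Hz
A = (f / f1)**2 * np.exp(-(f / f1)**2)      # amplitude spectrum, as in Sheet 1
t = np.arange(-0.1, 0.1, 0.001)             # time axis, in seconds
phi_constant = np.radians(90.0)             # constant phase shift, e.g. 90 degrees
phi_slope = np.radians(0.5)                 # linear phase: 0.5 degrees per Hz, in radians
# Each frequency component is a cosine weighted by A(f) and shifted by its phase;
# the wavelet is the sum of all the components.
phase = phi_constant + phi_slope * f
components = A[:, None] * np.cos(2 * np.pi * f[:, None] * t + phase[:, None])
wavelet = components.sum(axis=0)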
Finally, Sheet 3 displays a plot of the wavelet, and of the individual frequency components that make it up. Have fun!!!
Figure 8
References and further playing
A Petroleum Geologist’s Guide to Seismic Reflection, William Ashcroft, 2011, Wiley.
Seismic Data Analysis, Öz Yilmaz, 2001, Society of Exploration Geophysicists.
In Visualization tips for geoscientists: MATLAB, Part III, I showed that there’s a qualitative correlation between occurrences of antimony mineralization in the southern Tuscany mining district and the distance from lineaments derived from the total horizontal derivative (also called maximum horizontal gradient).
Let’s take a look at it below (distance from lineaments increases as the color goes from blue to green, yellow, and then red).
However, in a different map in the same post I showed that lineaments derived using the maxima of the hyperbolic tilt angle (Cooper and Cowan, 2006, Enhancing potential field data using filters based on the local phase) are offset systematically from those derived using the total horizontal derivative.
Let’s take a look at it below: in this case Bouguer gravity values increase as the color goes from blue to green, yellow, and then red; white polygons are basement outcrops.
The lineaments from the total horizontal derivative are in black, those from the maxima of hyperbolic tilt angle are in gray. Which lineaments should be used?
The ideal way to map the location of density contrast edges (as a proxy for geological contacts) would be to use gravity forward modelling, or even 3D gravity inversion, ideally constrained by all available independent data sources (magnetic or induced-polarization profiles, exploratory drilling data, reflection seismic interpretations, and so on).
The next best approach is to map edges using a number of independent gravity data enhancements, and then only use those that collocate.
Cooper and Cowan (in the same 2006 paper) demonstrate that no single edge-detection method is a perfect geologic-contact mapper. Citing Pilkington and Keating (2004, Contact mapping from gridded magnetic data – A comparison of techniques), they conclude that the best approach is to use “collocated solutions from different methods providing increased confidence in the reliability of a given contact location”.
I show an example of such a workflow in the image below. In the first column from the left is a map of the residual Bouguer gravity from a smaller area of interest in the southern Tuscany mining district (where measurements were made on a denser station grid). In the second column from the left are the lineaments extracted using three different (and independent) derivative-based data enhancements followed by skeletonization. The same lineaments are superimposed on the original data in the third column from the left. Finally, in the last column, the lineaments are combined into a single collocation map to increase confidence in the edge locations (I applied a mask so as to display edges only where at least two methods collocate). In short, the workflow (sketched in code after the list below) is to:
1) create a full suite of derivative-based enhanced gravity maps;
2) extract and refine lineaments to map edges;
3) create a collocation map.
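The sketch below is only schematic, and not the exact enhancements I used: it assumes a gridded residual Bouguer anomaly with regular spacing, computes three generic derivative-based attributes whose maxima tend to line up with density-contrast edges, skeletonizes their high values, and keeps only the edges where at least two results collocate. The function names and the percentile threshold are arbitrary choices of mine.
import numpy as np
from skimage.morphology import skeletonize

def vertical_derivative(grav, dx, dy):
    """First vertical derivative computed in the wavenumber domain (FFT)."""
    ny, nx = grav.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    return np.real(np.fft.ifft2(np.fft.fft2(grav) * np.hypot(KX, KY)))

def edge_attributes(grav, dx, dy):
    """Three generic derivative-based enhancements of a gridded gravity anomaly."""
    gy, gx = np.gradient(grav, dy, dx)          # horizontal derivatives
    gz = vertical_derivative(grav, dx, dy)
    thd = np.hypot(gx, gy)                      # total horizontal derivative
    asig = np.sqrt(gx**2 + gy**2 + gz**2)       # analytic signal amplitude
    gzy, gzx = np.gradient(gz, dy, dx)
    thd_dz = np.hypot(gzx, gzy)                 # horizontal gradient of the vertical derivative
    return thd, asig, thd_dz

def lineaments(attr, pct=90):
    """Binary lineaments: keep the highest attribute values, then skeletonize."""
    return skeletonize(attr > np.percentile(attr, pct))

# grav: 2D array of residual Bouguer gravity; dx, dy: station grid spacing
# skeletons = [lineaments(a) for a in edge_attributes(grav, dx, dy)]
# collocated = np.sum(skeletons, axis=0) >= 2   # keep edges where at least two methods agree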
These techniques can easily be adapted to collocate lineaments derived from seismic data, with which the same derivative-based enhancements are showing promising results (Russell and Ribordy, 2014, New edge detection methods for seismic interpretation).
I was curious to see how all these colormaps fared, but my expectation was that Jet would sink to the bottom. I was really surprised to see it came out on top, one vote ahead of the linear lightness rainbow (21 and 20 votes out of 62, respectively). The modified heated body followed with 11 votes.
My surprise comes from the fact that Jet carries perceptual artifacts within the progression of colours (see for example this post). One way to demonstrate these artifacts is to convert the 2D map into a 3D surface where again we use Jet to colour amplitude values, but we use the intensities from the 2D map for the elevation. This can be done for example using the Interactive 3D Surface Plot plugin for ImageJ (as in my previous post Lending you a hand with image processing – introduction to ImageJ). The resulting surface is shown in Figure 1. This is almost exactly what your brain would do when you look at the 2D map colored with Jet in the previous post.
Figure 1
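The surfaces in this post were made with the ImageJ plugin; as an aside, the same kind of display can be approximated in a few lines of Python with matplotlib. This is only a sketch: the data array below is a synthetic placeholder, and the intensity is calculated with the same luma weights given further down in this post.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm
# Synthetic placeholder standing in for a 2D map of seismic amplitudes
y, x = np.mgrid[0:100, 0:100]
data = np.sin(x / 12.0) * np.cos(y / 9.0)
norm = (data - data.min()) / (data.max() - data.min())
rgba = cm.jet(norm)                                   # the 2D map coloured with Jet
intensity = 0.2989 * rgba[..., 0] + 0.5870 * rgba[..., 1] + 0.1140 * rgba[..., 2]
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
# Elevation = perceived intensity of the Jet colours; facecolors = the Jet colours themselves
ax.plot_surface(x, y, intensity, facecolors=rgba, rstride=1, cstride=1, linewidth=0)
plt.show()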
In Figure 2 the same data is now displayed as a surface where the amplitude values were used for the elevation, with a very light sun shading to help a bit with the perception of relief, but no colormap at all. When comparing Figure 1 with Figure 2, one of the artifacts is immediately recognized: the highest values in Figure 2, which honours the data, become a relative low in Figure 1. This is because red has lower intensity than yellow, and therefore data colored in red in 2D are plotted at a lower elevation than data colored in yellow, even though the latter have lower amplitudes.
Figure 2
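A quick back-of-the-envelope check, using the luma formula given further down in this post, shows just how different the perceived intensities of pure red and pure yellow are (RGB values on a 0-255 scale):
red    = 0.2989 * 255 + 0.5870 * 0   + 0.1140 * 0     # about 76
yellow = 0.2989 * 255 + 0.5870 * 255 + 0.1140 * 0     # about 226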
For these reasons, I did not expect Jet to be the top pick. On the other hand, I think Jet is perhaps favoured because, with consistent use, our brain learns in part to accommodate for these artifacts in 2D maps, and because it has at least two regions of higher contrast (higher magnitude gradient) than other colormaps. Unfortunately, as I wrote in a recently published tutorial, these regions are randomly placed in the colormap, and the gradients are variable, so we gain on contrast but lose on faithfulness in representing the data structure.
Matt Hall wrote a great comment following the previous post, really making an argument for switching between multiple colormaps in the interpretation stage to explore and highlight features in both the signal and the noise in the data, and suggesting that perhaps no single colormap is best overall. I agree 100% with almost everything Matt said, except perhaps on the best overall: looking at the 2D maps, at least with this dataset, I feel the heated body could be the best overall colormap, even if marginally. In Figure 3, Figure 4, Figure 5, and Figure 6 I show the 3D displays obtained by converting the 2D grayscale, linear lightness rainbow, modified heated body, and cube lightness rainbow, respectively. Looking at the 3D displays altogether confirms that feeling for me.
Do you think either of them is ‘better’? If yes, can you explain why? If you are unsure and you want to learn how to answer such questions using perceptual principles and open source Python code, you can read my tutorial Evaluate and compare colormaps (Niccoli, 2014), one of the awesome Geophysical Tutorials from The Leading Edge. In the process you will learn many things, including how to calculate an RGB colormap’s intensity using a simple formula:
import numpy as np
# rgb: N x 3 array of the colormap's RGB values on a 0-255 scale
intensity = 0.2989 * rgb[:, 0] + 0.5870 * rgb[:, 1] + 0.1140 * rgb[:, 2]  # get the intensity
intensity = np.rint(intensity)  # round to the nearest integer
…and how to display the colormap as a colorbar with an overlay plot of the intensity as in Figure 2.
I recently added to my Matlab File Exchange function, Perceptually improved colormaps, a colormap for periodic data like azimuth or phase. I am going to briefly showcase it using data from my degree thesis in geology, which I used before, for example in Visualization tips for geoscientists – Matlab. Figure 1, from that post, shows residual gravity anomalies in milligals.
Figure 1
Often we’re interested in characterizing these anomalies by calculating the direction of maximum dip at each point on the surface and displaying the azimuth of that direction, the dip azimuth. I’ve done this for the surface of residual anomalies from Figure 1 and displayed the azimuth in Figure 2. Azimuths from 0 to 360 degrees are color-coded using Jet, Matlab’s standard colormap (until recently). Typically I do not trust azimuth values where the dip is close to zero, because they are often contaminated by noise, so I would use shading to de-saturate the colors where dip has the lowest values; for ease of discussion I haven’t done so in this case.
Figure 2. Azimuth values color-coded with Jet.
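For reference, this is roughly how such a dip azimuth map can be computed from a gridded surface; the grid below is a synthetic placeholder, and I assume x increases eastward and y northward:
import numpy as np
# Synthetic placeholder standing in for the gridded residual anomaly surface
y, x = np.mgrid[0:200, 0:200]
z = np.sin(x / 30.0) * np.cos(y / 40.0)
dy, dx = 1.0, 1.0                                # grid spacing
gy, gx = np.gradient(z, dy, dx)                  # the gradient points uphill
# Dip azimuth: compass direction of steepest descent, degrees clockwise from north
azimuth = (np.degrees(np.arctan2(-gx, -gy)) + 360.0) % 360.0
dip = np.degrees(np.arctan(np.hypot(gx, gy)))    # dip magnitude, useful to mask noisy low dips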
There are two problems with Figure 2. First, there are the well-known problems with the Jet colormap. For example, blue is too dark and blue areas appear as bands of constant colour. Yellow is much lighter than any other colour, so we see artificial yellow edges that are not really present in the data. But there is an additional issue in Figure 2 because azimuths close in value to 0 and 360 degrees are colored with blue and red, respectively, instead of a single color as they should be, causing an additional artificial edge.
In Figure 3 I recolored the map using a colormap that replicates those used in many geophysical software tools to display azimuth or phase data. This is better because it wraps around at 360 degrees but the perceptual issues are unresolved: in this case red, yellow and blue all appear as sharp perceptual edges.
Figure 3. Azimuth values color-coded with generic azimuth colormap.
Figure 4. Azimuth values color-coded with isoluminant azimuth colormap.
In Figure 4 I used my new colormap, called isoAZ (for isoluminant azimuth). This colormap is much better because not only does it wrap around at 360 degrees, but lightness is also held constant for all colors, which eliminates the perceptual anomalies. All the artificial yellow, red, and blue edges are gone; only real edges are left. This can be more easily appreciated in the figure below: if you hover over it with your mouse, you can switch back and forth between Figure 3 and Figure 4.
From an interpretation point of view, azimuths 180 degrees apart are of opposing colours, which is ideal for dip azimuth data because it allows us to easily recognize folds where dips of opposite direction are juxtaposed at an edge. One example is the sharp edge in the northwest quadrant of Figure 4, where magenta is juxtaposed to green. If you look at Figure 1 you see that there’s a relative high in this area (the edge in Figure 4) with dips of opposite direction on either side (East and West).
The colormap was created in the Lightness-Chroma-Hue color space, a polar transform of the Lab color space, where lightness is the vertical axis and, at each value of lightness, chroma is the radial coordinate and hue the polar angle. One limitation of this approach is that, due to the irregular shape of the color gamut section at each lightness value, we can never exceed chroma values of about 38-40 (at lightness = 65 in Matlab; in Python, with extensive trial and error, I have not been able to go past 36 using the Scikit-image Color module), which makes the resulting colors pale and pastel-like.
For those who want to experiment with it further, I used just a few lines of code similar to the ones below:
radius = 38; % chroma
theta = linspace(0, 2*pi, 256)'; % hue
a = radius * cos(theta);
b = radius * sin(theta);
L = (ones(1, 256)*65)'; % lightness
Lab = [L, a, b]; % assemble the Lab triplets
RGB = colorspace('RGB<-Lab', Lab(end:-1:1,:)); % convert to RGB (rows reversed)
This code is a modification of an example by Steve Eddins in a post on his Matlab Central blog. In Steve’s example the colormap cycles through the hues as lightness increases monotonically (which, by the way, is an excellent way to generate a perceptual rainbow). In this case lightness is kept constant while hue cycles through the entire 360 degrees and wraps around. Also, instead of using the Image Processing Toolbox, I used Colorspace, a free function from the Matlab File Exchange, for the color space transformations.
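For anyone working in Python rather than Matlab, a rough equivalent of the snippet above, using the scikit-image Color module mentioned earlier, might look like the lines below (note that chroma is set to 36, the limit I mentioned finding in Python):
import numpy as np
from skimage.color import lab2rgb
radius = 36                                     # chroma (the gamut limit found in Python)
theta = np.linspace(0, 2 * np.pi, 256)          # hue angle
L = np.full(256, 65.0)                          # constant lightness, hence isoluminant
a = radius * np.cos(theta)
b = radius * np.sin(theta)
lab = np.stack([L, a, b], axis=-1).reshape(1, 256, 3)
rgb = lab2rgb(lab)[0][::-1]                     # 256 x 3 RGB array in [0, 1], rows reversed as above
The resulting array can be passed straight to matplotlib's ListedColormap.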
For data like fracture orientation, where azimuths 180 degrees apart are equivalent, it is better to stack two of these isoluminant colormaps in a row. In this way we place opposing colors 90 degrees apart, whereas colors 180 degrees apart are the same. You can do it using the Matlab commands repmat or vertcat, as below:
radius = 38; % chroma
theta = linspace(0, 2*pi, 128)'; % hue
a = radius * cos(theta);
b = radius * sin(theta);
L = (ones(1, 128)*65)'; % lightness
Lab = [L, a, b];
rgb = colorspace('RGB<-Lab', Lab(end:-1:1,:)); % 128-color isoluminant colormap
RGB = vertcat(rgb, rgb); % stack two copies so colors 180 degrees apart match
At the end of the series The rainbow is dead…long live the rainbow! I introduced CubicYF, a rainbow-like color palette with a slightly compressive, monotonically increasing lightness profile. The red color is missing from the palette because people with deuteranopia, the most common type of color vision deficiency, confuse red and green, so I took one of the two out. An alternative solution to taking the red out is to have a yellow-like color with lower lightness followed by a red-like color of higher lightness, as in the LinearL palette described in The rainbow is dead…long live the rainbow! – Part 5. A third solution, which is a compromise in perceptual terms, is CubicL (sometimes I called it cube1), the color palette I used with the gravity data from my degree thesis in geology in the series Visualization tips for geoscientists. An example map from the series in which I used CubicL is in Figure 1 below:
Figure 1
To generate CubicL I used a compressive function for the increase in lightness very much like the one for CubicYF, except in this case I allowed lightness to decrease from about 90 in the yellow portion of the colormap to about 80 to get to the red. As a result, there is an inversion in the lightness trend, indicated by the arrow in the bottom row in Figure 2. However, I believe this is an acceptable compromise, and one that is pleasing for people accustomed to rainbow (it has red), because the inversion is smooth, with a gentle downward-facing concavity. I also believe this colormap is still a perceptual improvement over the rainbow or spectrum so commonly used in seismic visualization, which typically have at least 3 sharp lightness inversions (indicated by arrows in the top row of Figure 2 below).
Since then Giuliano has been kind enough to provide me with the data for one of his spectrograms, so I am resuming the discussion. Below is a set of 5 figures generated in Matlab from the same data using different colormaps. With this post I’d like to get readers involved: please cast your vote for the colormap you prefer, and even drop a line in the comments section to tell us the reason for your preference.
In the second post I’ll show the data displayed with the same 5 colormaps but using a different type of visualization, which will reveal what our brain is doing with the colours (without our full knowledge and consent), and then I will ask you again to vote for your favourite.
There’s something about tiled floors (and tiles in general) that has always mesmerized me and this is a really good one. I like the unusual perspective too.
There’s a radial pattern and a circular pattern of tiles, but if I stare at the photo I also see hints of an interference pattern, an intriguing flower pattern. I think this is a genuine Moiré pattern, quite well known to photographers, generated by interference between the circles of tiles at varying distances and the camera’s sensor pixel grid.
In my next post I will look in more detail at Moiré patterns and try to explain how they form.
I will show how the effect can be removed, or at least reduced, particularly from scanned images. Similar techniques can be used to remove acquisition footprint from reflection seismic data, which will be the topic of an upcoming series on MyCarta.