Android seismograph app evaluation

Introduction

Over the last couple of years I have looked at a number of apps of all sorts. Some were seismometers. Of those, two had the (to me) desirable 3-component recording and export capabilities: Seismometer by Yellowagents, which is for iPhone, and Seismograph alpha by Calvico, which is for Android and is the one I am talking about in this post.

Below are a few screen captures from Seismograph’s download page.

seismograph_capture

seismograph alpha

This looked good, so I downloaded it, and played with it a bit.

Functionality

You can change the color of the three components in the settings. I moved away from the default green, red, and blue (below, left), since green and red can be confusing for color-blind viewers, and used magenta, green, and black, which are less confusing (below, right). I also switched the background from black to white and increased the saturation of the new colors.

colors

You can also change the recording sensitivity on the fly with a pinch gesture, see below.

sensitivity

Really cool. But after the initial excitement, I moved on. Because really, how many times can you tap the phone from every possible direction, export the data, load it in Excel, stare at it for a while, show it to your colleagues at work, and friends and family at home?

Here comes the dynamite

I recently had an opportunity to go back to this app and record some real data from a dynamite blast at a construction site!

Next to my office in Stavanger they have been building a new condo complex (personal note: as I publish this I am no longer in Stavanger, I moved back to Canada). Of course this being Norway, they soon had to deal with granite, and a whole pile of it.

OK, perhaps not quite that much rock, and not as much as 25 tons of explosive at once, but you can see in the picture below (from Google Street View) that there’s a good amount of rock outcropping behind the parking lot and to the right of the car (and underneath it).

before

before

All that rock had to go, and so it was that, for more than three weeks, a couple of charges were shot every day. This was very convenient, because once I set my mind on recording a shot it took me a few experiments to get the setup right. But before I get to that, here is a photo I took a few weeks after the work had started, right after I recorded the shot I am using for this post.

after

after

The next two photos below show, respectively, one of the several excavation fronts, and the big pile of granite removed.

Dig2

Looking at the excavation front


Dig1

The big pile

Recording of the raw data

As I said, it was good that there were so many shots, because it took me some time to get the right setup and a good recording. The two biggest issues were:

– the positioning of the phone: I started in my office, first placing the phone on the window sill, but I soon realized the ‘coupling’ was not very good, as the sill was aluminum and a bit shaky. The floor was a lot better, but I finally tried the cement in the parking lot outside the building, which proved even better.

– the phone lock: I figured out that the recording stops when the phone locks and resumes when you unlock it. The first time, this happened right before the shot, so I lost it. Also, having to touch the phone to unlock it while recording results in a very noisy section.

I finally managed to get a good recording. Here’s the raw data for those readers who may want to play with it.

Manipulating the data

This is the part that took a bit of work. The first thing to do is to generate a time vector from the recorded date column; this is a fairly common task when working with field data. In the code box below I added the first 50 rows of time data. The date, which I call calendar time, is in the second column. The first column, with the sample number, was not in the recorded data; I added it here for reference. The third column is the output time vector, which is cumulative time. To get to this result I copied only the milliseconds from the calendar time column. I used UltraEdit, my favourite text editor, which allows you to work with columns and column portions, not just rows.

Then I used an MS Excel formula to generate the cumulative time vector. Once I got to this point I realized that the sample rate is variable. To make this clearer, I added a column called dt, which is the difference between pairs of consecutive samples. I am not sure whether this is a common occurrence or not. For my MSc thesis I worked with ORION land seismometers equipped with three-component geophones, and I had to correct for drift and for dead time (a short interruption in recording every minute due to GPS clock updates), but not for an uneven sample rate. Has anyone seen this before?

sample, calendar time, t(ms), dt(ms)

1,  013 06 27 12:19:36.972, 0,   11 
2,  013 06 27 12:19:36.983, 11,  5 
3,  013 06 27 12:19:36.988, 16,  2 
4,  013 06 27 12:19:36.990, 18,  2 
5,  013 06 27 12:19:36.992, 20,  3 
6,  013 06 27 12:19:36.995, 23,  3 
7,  013 06 27 12:19:36.998, 26,  1 
8,  013 06 27 12:19:36.999, 27,  2 
9,  013 06 27 12:19:37.001, 29,  2 
10, 013 06 27 12:19:37.003, 31,  2 
11, 013 06 27 12:19:37.005, 33,  2 
12, 013 06 27 12:19:37.007, 35,  2 
13, 013 06 27 12:19:37.009, 37,  9 
14, 013 06 27 12:19:37.018, 46,  2 
15, 013 06 27 12:19:37.020, 48,  17 
16, 013 06 27 12:19:37.037, 65,  1 
17, 013 06 27 12:19:37.038, 66,  2 
18, 013 06 27 12:19:37.040, 68,  34 
19, 013 06 27 12:19:37.074, 102, 2 
20, 013 06 27 12:19:37.076, 104, 2 
21, 013 06 27 12:19:37.078, 106, 2 
22, 013 06 27 12:19:37.080, 108, 2 
23, 013 06 27 12:19:37.082, 110, 2 
24, 013 06 27 12:19:37.084, 112, 2 
25, 013 06 27 12:19:37.086, 114, 5 
26, 013 06 27 12:19:37.091, 119, 2 
27, 013 06 27 12:19:37.093, 121, 4 
28, 013 06 27 12:19:37.097, 125, 34 
29, 013 06 27 12:19:37.131, 159, 5 
30, 013 06 27 12:19:37.136, 164, 2 
31, 013 06 27 12:19:37.138, 166, 2 
32, 013 06 27 12:19:37.140, 168, 4 
33, 013 06 27 12:19:37.144, 172, 2 
34, 013 06 27 12:19:37.146, 174, 14 
35, 013 06 27 12:19:37.160, 188, 5 
36, 013 06 27 12:19:37.165, 193, 14 
37, 013 06 27 12:19:37.179, 207, 7 
38, 013 06 27 12:19:37.186, 214, 11 
39, 013 06 27 12:19:37.197, 225, 12 
40, 013 06 27 12:19:37.209, 237, 17 
41, 013 06 27 12:19:37.226, 254, 2 
42, 013 06 27 12:19:37.228, 256, 14 
43, 013 06 27 12:19:37.242, 270, 7 
44, 013 06 27 12:19:37.249, 277, 11 
45, 013 06 27 12:19:37.260, 288, 16 
46, 013 06 27 12:19:37.276, 304, 3 
47, 013 06 27 12:19:37.279, 307, 14 
48, 013 06 27 12:19:37.293, 321, 8 
49, 013 06 27 12:19:37.301, 329, 11 
50, 013 06 27 12:19:37.312, 340, 7
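For readers who would rather script this step than do it with a text editor and Excel, here is a minimal Matlab sketch of the same idea: parse the calendar-time strings, build the cumulative time vector in milliseconds, and compute dt. The file name and column layout below are my assumptions, not necessarily the app’s actual export format, so adjust them to whatever your export looks like.

% Minimal sketch: parse the exported calendar-time strings and build the
% cumulative time vector (ms) and the sample interval dt.
% The file name and column order are assumptions; adjust to the actual export.
fid = fopen('seismograph_export.csv');                  % hypothetical file name
C   = textscan(fid, '%s %f %f %f', 'Delimiter', ',');   % calendar time, x, y, z (assumed order)
fclose(fid);

stamps = C{1};                                          % e.g. '2013 06 27 12:19:36.972'
x = C{2}; y = C{3}; z = C{4};

% Convert calendar time to serial date numbers, then to milliseconds
% elapsed since the first sample (the cumulative time column above).
dn = datenum(stamps, 'yyyy mm dd HH:MM:SS.FFF');
t  = round((dn - dn(1)) * 24 * 3600 * 1000);            % cumulative time, ms

dt = diff(t);                                           % interval between consecutive samples, ms
fprintf('mean dt = %.2f ms\n', mean(dt));               % close to 10 ms for this recording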

To get around this I decided to resample the data, which I did in Matlab.

I used a new sampling interval of 10 ms, after taking the average of all values in the dt column, which turned out to be almost exactly 10 ms. I suspect this may be the ‘nominal’ sample rate for the app, although I could not find confirmation of this anywhere.

Here’s the new, regularly sampled time vector (strictly speaking it is not interpolated), which I used to resample (this time with actual interpolation) the x, y, and z components. For those interested, here is the final ASCII file with interpolated x, y, z, and time.
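This is roughly what the resampling step looks like in Matlab, assuming the t, x, y, and z vectors from the previous sketch; it is a sketch under those assumptions, not the exact script I used.

% Sketch of the resampling step, assuming t, x, y, z from the previous
% snippet (cumulative time in ms and the three raw components).
dt_new = 10;                           % new sample interval in ms, ~ mean(dt)
t_new  = (0:dt_new:t(end))';           % regular time vector (no interpolation needed here)

% interp1 needs unique, strictly increasing sample times, so drop any duplicates.
[t_u, iu] = unique(t);

% Resample each component onto the regular grid by linear interpolation.
x_new = interp1(t_u, x(iu), t_new, 'linear');
y_new = interp1(t_u, y(iu), t_new, 'linear');
z_new = interp1(t_u, z(iu), t_new, 'linear');

% Save the resampled data as an ASCII file (the column order is my choice).
dlmwrite('seismograph_resampled.txt', [x_new, y_new, z_new, t_new]);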

First quick look

Below is a plot of the data, which I generated in Matlab. There are two sections of high signal in all three recorded components. The first one is at the beginning of the recording, and is due to the phone still moving for a couple of seconds after I touched it to start the recording. The second section, right after 2 x 10^4 ms, is the recorded shot.

3-C
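For completeness, here is a quick sketch of how a three-component plot like the one above can be generated, assuming the resampled vectors from the previous step; it is not the exact script that produced my figure.

% Quick sketch of a three-component plot, assuming the resampled vectors
% t_new, x_new, y_new, z_new from the previous step.
figure;
comps  = {x_new, y_new, z_new};
labels = {'x component', 'y component', 'z component'};
for k = 1:3
    subplot(3, 1, k);
    plot(t_new, comps{k}, 'k');
    ylabel(labels{k});
    grid on;
end
xlabel('time (ms)');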

Future posts

That’s it for this post. In my next post or two I will take a closer look at the data: I will use a tap test to assess polarity, run a spectral analysis, try hodogram plots, and hopefully much more.

Related posts (external)

Scientists want to turn smartphones into earthquake sensors