Android seismograph app evaluation


Over the last couple of years I have looked at a number of apps of all sorts. Some were seismometers. Out of those, two had the (to me) desired 3-component recording and export capabilities: Seismometer by Yellowagents, which is for iPhone, and Seismograph alpha by Calvico, which is for Android and is the one I am talking about in this post.

These are several screen captures from Seismograph’s download page.


seismograph alpha

This looked good, so I downloaded it, and played with it a bit.


You can change the color of the three components in the settings. I moved away from the default green, red, and blue (below, left), since green and red can be confusing for color-blind viewers, and used magenta, green, and black, which are less confusing (below, right). I also switched from a black to a white background and increased the saturation of the new colors.


You can also change the recording sensitivity on the fly with a pinch gesture, as shown below.


Really cool. But after the initial excitement, I moved on. Because really, how many times can you tap the phone from every possible direction, export the data, load it in Excel, stare at it for a while, show it to your colleagues at work, and friends and family at home?

Here comes the dynamite

I recently had an opportunity to go back to this app and record some real data from a dynamite blast in a construction site!

Next to my office in Stavanger they have been building a new condo complex (personal note: as I publish this I am no longer in Stavanger, I moved back to Canada). Of course this being Norway, they soon had to deal with granite, and a whole pile of it.

OK, perhaps not that much rock, and nothing like 25 tons of explosive at once, but you can see in the picture below (from Google Street View) that there’s a good amount of rock outcropping behind the parking lot and to the right of the car (and underneath it).



All that rock had to go, and so it was that for more than 3 weeks a couple of charges were shot every day. This was very good because, once I set my mind on recording a shot, it took me a few experiments to get the setup right. But before I get to that, here is a photo I took a few weeks after the work had started, right after I recorded the shot I am using for this post.



The next two photos below show, respectively, one of the several excavation fronts, and the big pile of granite removed.


Looking at the excavation front



The big pile

Recording of the raw data

As I said, it was good there were so many shots, as it took me some time to get the right setup and a good recording. The two biggest issues were:

– the positioning of the phone: I started in my office, first placing the phone on the window sill, but I soon realized the ‘coupling’ was not so good, as the sill was aluminum and a bit shaky. The floor was a lot better, but I finally tried the cement in the parking lot outside the building, which proved even better.

– the phone lock: I figured out that the recording stops when the phone locks and resumes when you unlock it. The first time, this happened right before the shot, which I lost. Also, having to touch the phone to unlock it while recording results in a very noisy section.

I finally managed to get a good recording. Here’s the raw data for those readers who may want to play with it.

Manipulating the data

This is the part that took a bit of work. The first thing to do is to generate a time vector from the recorded date column; this is a fairly common task when working with field data. In the code box below I added the first 50 rows of time data. The date, which I call calendar time, is in the second column. The first column, with the sample number, was not in the recorded data; I added it here for reference. The third column is the output time vector, which is cumulative time. To get this result I copied only the milliseconds from the calendar time column. I used UltraEdit, my favourite text editor, which can operate on columns and column portions, not just rows.

Then I used an MS Excel formula to generate the cumulative time vector. Once I got this far I noticed that the sample rate is variable. I added a column called dt, the difference between consecutive samples, to make this clearer. I am not sure whether this is a common occurrence. In my MSc thesis I worked with ORION land seismometers equipped with three-component geophones and had to correct for drift and for dead time (a short interruption in recording every minute due to GPS clock updates), but not for an uneven sample rate. Has anyone seen this before?

sample, calendar time, t(ms), dt(ms)

1,  013 06 27 12:19:36.972, 0,   11 
2,  013 06 27 12:19:36.983, 11,  5 
3,  013 06 27 12:19:36.988, 16,  2 
4,  013 06 27 12:19:36.990, 18,  2 
5,  013 06 27 12:19:36.992, 20,  3 
6,  013 06 27 12:19:36.995, 23,  3 
7,  013 06 27 12:19:36.998, 26,  1 
8,  013 06 27 12:19:36.999, 27,  2 
9,  013 06 27 12:19:37.001, 29,  2 
10, 013 06 27 12:19:37.003, 31,  2 
11, 013 06 27 12:19:37.005, 33,  2 
12, 013 06 27 12:19:37.007, 35,  2 
13, 013 06 27 12:19:37.009, 37,  9 
14, 013 06 27 12:19:37.018, 46,  2 
15, 013 06 27 12:19:37.020, 48,  17 
16, 013 06 27 12:19:37.037, 65,  1 
17, 013 06 27 12:19:37.038, 66,  2 
18, 013 06 27 12:19:37.040, 68,  34 
19, 013 06 27 12:19:37.074, 102, 2 
20, 013 06 27 12:19:37.076, 104, 2 
21, 013 06 27 12:19:37.078, 106, 2 
22, 013 06 27 12:19:37.080, 108, 2 
23, 013 06 27 12:19:37.082, 110, 2 
24, 013 06 27 12:19:37.084, 112, 2 
25, 013 06 27 12:19:37.086, 114, 5 
26, 013 06 27 12:19:37.091, 119, 2 
27, 013 06 27 12:19:37.093, 121, 4 
28, 013 06 27 12:19:37.097, 125, 34 
29, 013 06 27 12:19:37.131, 159, 5 
30, 013 06 27 12:19:37.136, 164, 2 
31, 013 06 27 12:19:37.138, 166, 2 
32, 013 06 27 12:19:37.140, 168, 4 
33, 013 06 27 12:19:37.144, 172, 2 
34, 013 06 27 12:19:37.146, 174, 14 
35, 013 06 27 12:19:37.160, 188, 5 
36, 013 06 27 12:19:37.165, 193, 14 
37, 013 06 27 12:19:37.179, 207, 7 
38, 013 06 27 12:19:37.186, 214, 11 
39, 013 06 27 12:19:37.197, 225, 12 
40, 013 06 27 12:19:37.209, 237, 17 
41, 013 06 27 12:19:37.226, 254, 2 
42, 013 06 27 12:19:37.228, 256, 14 
43, 013 06 27 12:19:37.242, 270, 7 
44, 013 06 27 12:19:37.249, 277, 11 
45, 013 06 27 12:19:37.260, 288, 16 
46, 013 06 27 12:19:37.276, 304, 3 
47, 013 06 27 12:19:37.279, 307, 14 
48, 013 06 27 12:19:37.293, 321, 8 
49, 013 06 27 12:19:37.301, 329, 11 
50, 013 06 27 12:19:37.312, 340, 7
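The same time-vector construction can be sketched in Python. This is only a sketch, not the workflow I actually used (UltraEdit plus Excel); the four timestamps below are taken from the listing above, and the truncated leading year digits are assumed here to be "2" (i.e. 2013) purely so the strings parse.

```python
from datetime import datetime

# A few rows of the recorded calendar-time column (the "2" completing the
# year is an assumption made only so strptime can parse the strings).
rows = [
    "2013 06 27 12:19:36.972",
    "2013 06 27 12:19:36.983",
    "2013 06 27 12:19:36.988",
    "2013 06 27 12:19:36.990",
]

# Parse each row into a datetime object.
times = [datetime.strptime(r, "%Y %m %d %H:%M:%S.%f") for r in rows]

# Cumulative time vector in milliseconds, relative to the first sample.
t_ms = [round((t - times[0]).total_seconds() * 1000) for t in times]

# dt column: forward difference between consecutive samples.
dt_ms = [b - a for a, b in zip(t_ms, t_ms[1:])]

print(t_ms)   # [0, 11, 16, 18]
print(dt_ms)  # [11, 5, 2]
```

The printed values match the first rows of the listing above, confirming that dt is the forward difference between each sample and the next.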

To get around this I decided to resample the data, which I did in Matlab.

I used a new sample interval of 10 ms, after finding that the average of all values in the dt column was almost exactly 10 ms. I suspect this may be the ‘nominal’ sample rate for the app, although I could not find confirmation anywhere.

Here’s the uniformly sampled time vector (no interpolation was needed to build it), which I used to resample (this time with actual interpolation) the x, y, and z components. For those interested, here is the final ASCII file with interpolated x, y, z, time.
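For readers without Matlab, the resampling step can be sketched in Python with NumPy. The time and amplitude values below are toy stand-ins for the exported columns, not the real data:

```python
import numpy as np

# Irregular cumulative time vector (ms) and one recorded component --
# toy values standing in for the app's exported columns.
t = np.array([0.0, 11.0, 16.0, 18.0, 20.0, 23.0, 26.0, 27.0, 29.0, 31.0])
z = np.sin(t / 10.0)  # placeholder amplitudes

# Build a uniform time axis at the ~10 ms nominal interval found above
# (no interpolation here, just a regularly spaced vector).
dt_new = 10.0
t_new = np.arange(0.0, t[-1], dt_new)  # 0, 10, 20, 30 ms

# Resample the component onto the uniform axis (linear interpolation);
# the same call would be repeated for x and y.
z_new = np.interp(t_new, t, z)

print(t_new)  # [ 0. 10. 20. 30.]
```

Linear interpolation is the simplest reasonable choice here; for a signal with sharp onsets like a shot, it keeps arrival times intact while evening out the sampling.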

First quick look

Below is a plot of the data, which I generated in Matlab. There are two sections of high signal in all three recorded components. The first one is at the beginning of the recording, and it is due to the phone still moving for a couple of seconds after I touched it to start the recording. The second section, right after 2 x 10^4 ms, is the recorded shot.
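A quick-look plot of this kind is easy to reproduce with matplotlib. The snippet below uses synthetic data (background noise with a burst of energy just after 2 x 10^4 ms, mimicking the shot), since the layout, not the values, is the point:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Synthetic stand-in for the resampled x, y, z components: quiet
# background noise with a burst of "shot" energy late in the record.
rng = np.random.default_rng(0)
t = np.arange(0, 30000, 10)              # 30 s of data at 10 ms
data = rng.normal(0, 0.01, (3, t.size))  # three components of noise
shot = (t > 20000) & (t < 21000)         # burst just after 2 x 10^4 ms
data[:, shot] += rng.normal(0, 0.5, (3, int(shot.sum())))

# One stacked panel per component, shared time axis.
fig, axes = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
for ax, comp, label in zip(axes, data, ("x", "y", "z")):
    ax.plot(t, comp, lw=0.5)
    ax.set_ylabel(label)
axes[-1].set_xlabel("time (ms)")
fig.savefig("three_components.png", dpi=100)
```

Stacked panels with a shared time axis make it easy to confirm that the shot shows up simultaneously on all three components.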


Future posts

That’s it for this post. In the next post or two I will take a closer look at the data: I will use a tap test to assess polarity, run a spectral analysis, try hodogram plots, and hopefully much more.

Related posts (external)

Scientists want to turn smartphones into earthquake sensors

6 thoughts on “Android seismograph app evaluation”

  1. Great post Matteo, I have been looking to make some waveforms with my phone and passing the records back for some analysis.

    By banging my fist against the phone on a desk, and on a cushion, I was able to make a variety of pulses. Nothing as dramatic as the dynamite example, but still.

    I found that I could produce a decent Ricker wavelet by holding the phone rigidly in my left hand and pounding my right one against my chest like a gorilla. Seriously. I figure it has to do with the de-coupling of the medium (me) that causes a Ricker-like shape as opposed to something that is a minimum-phase pulse.

    One nice feature extension would be a count-down timer on the record button (better yet, have a wifi sync with the primer cord!); this would remove the shake at the start of the record.

  2. Also, what a pain in the butt with the time vector; what’s needed is a constant sample rate. I don’t really understand why this isn’t an option. Another must-have feature is being able to select the recording rate. Next question: does the app or sensor actually record acceleration? If so, does that mean you need to take two anti-derivatives to get a time series that is displacement as a function of time? I suppose only then could you begin to analyze the time series in a seismic-trace-like way.

  3. Matt, Evan

    Thanks for the feedback here and for the tweets.
    I like the suggestion about the timer Evan. I think we should pass all these things, including the constant sample rate, to the developers. I think at this point I may have to take a look at the iPhone app too.

    As for the Ricker wavelet, I’d love to see it. We could add it to one of these posts, or if you feel like writing a few lines I’d be happy to publish a guest post here. Just let me know.

    About accelerometer data: I guess I wasn’t planning far enough ahead to have given some thought to the reduction of the accelerometer data. I was wrapping up things in Stavanger and all I thought was “get the data right, get all the photos, do some measurements with the theodolite, do a tap test, etcetera etcetera.”
    So, very good question, and no, I don’t have an answer. Yet.
    My thesis’ multicomponent ORIONs were ‘plain’ 3-C geophones, and the reduction software automatically output seismic traces. I confess the only times I heard in detail about accelerometers were in a multicomponent course at UofC in 2002, and in Norm Cooper’s acquisition class in 2006, but I don’t recall anything about data reduction. I think the next step should be some research, and perhaps a call to some of the folks in industry who work with multicomponent data, or accelerometers. I just dug out some good CREWES material here and here that I plan to go through.
