Hydrophone arrays, FTW!

Figure 1. Amy and Emily deploying the array off the back deck of the R/V Ocean Starr, and pulling it back in at the end of the day.

We do a lot of things out here on the CalCurCEAS cruise – we play cribbage, we eat cookies, we ride the stationary bicycle – but mostly we do these two things, A LOT:

1) Look for whales and dolphins.
2) Listen for whales and dolphins.

Part 1, the “Look” part, is the realm of the visual team, who stand bravely in wind and cold and rain (and sometimes gorgeous, balmy sunshine) and scan the surface of the ocean for animals that might pop up to say hi and/or take a quick breath of air.

Part 2, the “Listen” part, exists because, well, it’s pretty difficult to see what animals are doing once they dip down beneath the waves. And we all know that’s where the fun stuff happens (synchronized swimming practice? underwater tea parties? you never know!).

Since I’m on the acoustics team, I thought I’d give a little overview of how our operation works. We eavesdrop on the animals using a pair of hydrophones, which are basically underwater microphones. If we used just one hydrophone, that would be okay; we could still hear the animals. But adding a second lets us not only hear them, but also figure out where they are.

Figure 2. Top: Array geometry, showing how we can tell which direction a sound is coming from. (The poor creature pictured is my sad attempt at a striped dolphin, if you were wondering) Bottom: Signals measured on each of the hydrophones.

The two hydrophones are built into the end of a cable that is spooled onto a winch on the back deck. Every morning we head out there, don our life jackets, hard hats, and boots, and reel out the cable until the hydrophones are 300 meters behind us. The hydrophones are spaced one meter apart, and once the vessel is up to speed, they ride at roughly the same depth.

The upper part of Figure 2 (blue) shows a cartoon schematic of the hydrophone setup. Here’s what it all means:

H_1 and H_2 – hydrophones #1 and #2

Known quantities:

d_h – distance between H_1 and H_2. For our array this distance is 1 meter

c_w – measured sound speed in water (approximately 1500 m/s, but depends on temperature and salinity)

Measured/derived quantities:

\Delta t – time delay between when the signal arrives at H_1 and H_2

d' – distance associated with the time delay \Delta t, derived using c_w

Unknown quantity:

\theta – angle between the incoming ray path and the array baseline. This is what we’re solving for!

OK, that seems complicated. Feel free to ignore the math, of course. The basic idea is that, depending on where the sound comes from, it arrives at the two hydrophones at slightly different times. For example, in the image above, it hits hydrophone 1 first, and some small amount of time later, it hits hydrophone 2. The signals from each of the hydrophones are sent upstairs to the acoustics lab (see the bottom part of Figure 2). The call shows up at slightly different times on the two hydrophone channels, and we can measure that time delay \Delta t very precisely.

Using the time delay \Delta t and the measured sound speed c_w, we can obtain the distance d':

d' = c_w \Delta t

So now we’ve got a right triangle where we know the hypotenuse (the hydrophone spacing d_h) and one other side (d'), and you know what that means – trigonometry time!! Everyone’s favorite time! We finally have what we need to solve for the angle \theta.

\theta = \arccos\left( \frac{d'}{d_h} \right)
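If you’d rather see it in Matlab than in symbols, here’s a toy version of the whole chain: it fakes a click arriving on the two hydrophones, estimates \Delta t by cross-correlating the channels, and converts that to a bearing. The numbers (sample rate, click shape, noise level) and the use of xcorr to pick off the delay are my own illustrative choices, not a description of our actual acquisition software.

% Toy bearing calculation for a two-element towed array
fs  = 192e3;                 % sample rate (Hz), made up for this example
c_w = 1500;                  % sound speed in water (m/s)
d_h = 1;                     % hydrophone separation (m)

% Fake "click": a short windowed tone burst
t_click = (0:255)/fs;
click   = hann(256)' .* sin(2*pi*30e3*t_click);

% Pretend the sound reaches H_1 0.4 ms before H_2
dt_true = 0.4e-3;                        % seconds
n_del   = round(dt_true*fs);             % delay in samples

N  = 4096;
h1 = zeros(1,N);  h2 = zeros(1,N);
h1(1000:1255)             = click;
h2(1000+n_del:1255+n_del) = click;
h1 = h1 + 0.05*randn(1,N);               % a little noise on each channel
h2 = h2 + 0.05*randn(1,N);

% Delta t from the peak of the cross correlation between channels
[r, lags] = xcorr(h2, h1);
[~, imax] = max(r);
dt_est    = lags(imax)/fs;

% Delay -> path-length difference -> bearing
d_prime = c_w * dt_est;                  % d' = c_w * Delta t
theta   = acosd(d_prime / d_h);          % bearing relative to the array axis
fprintf('Delta t = %.3f ms,  theta = %.1f degrees\n', 1e3*dt_est, theta)

For a 0.4 ms delay, that works out to d' = 0.6 m and a bearing of about 53 degrees.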

We now know what angle the dolphin is at relative to the array. Huzzah! But wait. There are just a couple of little details that you need to remember (see Figure 3). First: you don’t know how far away the dolphin is. Second: there’s this pesky thing called “left-right ambiguity” *.


From the perspective of the array, there’s no difference between an animal calling from an angle \theta to the left and an animal calling from an angle \theta to the right.

Figure 3. Left-right and range ambiguity.

These are fundamental limitations of the method, but we can (sort of) overcome them. As the vessel moves along, and we estimate angles at different locations, we end up with a location where most of the bearing lines intersect. If the vessel is traveling in a straight line, we can get a good idea of range – how far the animal is from the trackline. We just won’t know which side of the trackline it’s on. But if the vessel makes a turn, the new bearings estimated after the turn will resolve which side of the line it’s on!

Figure 4. Bearing estimates (red lines) taken at different locations along the track line. Probable location is where most of the lines intersect.
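If you want to convince yourself that the turn really does the trick, here’s a little Matlab sketch of the geometry. The track and the animal position are completely made up, and this isn’t our real localization code; it just computes the cone angle to a pretend animal at each vessel position (standing in for the measured bearings), builds the two mirror-image left/right hypotheses, and checks which one actually has a common crossing point.

% Toy demo: a turn in the track resolves the left/right ambiguity
animal = [600; 800];                          % made-up animal position (m)

% Vessel positions: a leg heading north, then a leg heading northeast
pos = [zeros(1,6),  100:100:500;
       0:200:1000,  1100:100:1500];           % 2 x 11 positions (m)
hdg = [repmat([0;1],1,6), repmat([1;1]/sqrt(2),1,5)];   % unit headings

rot = @(a) [cos(a) -sin(a); sin(a) cos(a)];   % 2-D rotation
nP  = size(pos,2);

for side = [+1 -1]                            % the two left/right hypotheses
    U = zeros(2,nP);  A = zeros(2);  b = zeros(2,1);
    for k = 1:nP
        v      = animal - pos(:,k);
        theta  = acos(dot(hdg(:,k), v)/norm(v));    % unsigned cone angle
        U(:,k) = rot(side*theta)*hdg(:,k);          % candidate bearing ray
        P      = eye(2) - U(:,k)*U(:,k)';           % projector onto ray normal
        A      = A + P;    b = b + P*pos(:,k);
    end
    xhat = A\b;                               % least-squares crossing point
    res  = 0;
    for k = 1:nP
        res = res + norm((eye(2) - U(:,k)*U(:,k)')*(xhat - pos(:,k)))^2;
    end
    fprintf('hypothesis %+d: crossing at (%5.0f, %5.0f) m, RMS miss %.1f m\n', ...
            side, xhat(1), xhat(2), sqrt(res/nP))
end

One hypothesis lands on a single tight crossing point; the mirrored one can’t make all of its bearing lines agree once the heading changes, which is exactly what Figure 4 is showing.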

At this point you might be wondering, Michelle, what assumptions are you making when you do these calculations? So here they are:

Assumptions**:

  • The array is horizontal
  • The animals are calling at the surface
  • The animals are staying in approximately the same location for the duration of the measurements

So there you have it. That’s how we locate animals underwater with a towed, 2-element hydrophone array.


* Yin, one of the amazing visual observers on the cruise, thinks “Left-right ambiguity” would be a great name for a band, and I agree.

** assumptions are made to be broken

Matched filtering animation

Get ready for the most boring animation, ever. EVER! (don’t say I didn’t warn you). I made this for a talk I gave for the MG&G lunch seminar a couple of weeks ago. I wanted to figure out a way to describe how a matched filter works, and found that I was doing lots of crazy hand gestures that weren’t helping me at all. Matlab animation to the rescue!

Let’s say you have your received time series, s(t), and a simulated version of the transmitted call, x(t), with duration T. The cross correlator output is simply:

y(t) = \int_0^{T} x(\tau)\, s(t+\tau)\, d\tau

The function y(t) has peaks that correspond to onset times of the calls in the original time series.

In the top panel, the blue line is the received time series, and the red line is the simulated signal. Buried in the blue time series is the original signal, between seconds 2 and 3. It’s hard to see it by eye, but the matched filter plucks it right out! The black line in the second panel is the output of the cross correlation. The peak aligns exactly with the start time of the signal. Miracle!
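If you’d rather poke at the idea yourself, here’s a rough Matlab stand-in for the animation. The chirp, the noise level, and the timing are arbitrary choices on my part; the point is just that the correlation peak falls at the onset of the buried call.

% Matched filter toy example: find a known call buried in noise
fs = 10e3;                               % sample rate (Hz), arbitrary
t  = 0:1/fs:1;                           % 1-second replica
x  = chirp(t, 500, 1, 1500) .* hann(numel(t))';   % simulated call x(t)

s  = randn(1, 5*fs + 1);                 % 5 s of noise: the received series s(t)
i0 = round(2.3*fs);                      % bury the call starting at t = 2.3 s
s(i0:i0+numel(x)-1) = s(i0:i0+numel(x)-1) + x;

[y, lags] = xcorr(s, x);                 % cross-correlate received with replica
[~, imax] = max(abs(y));
fprintf('correlation peak at t = %.3f s (true onset %.3f s)\n', ...
        lags(imax)/fs, i0/fs)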

Matched filtering video from Michelle Wray on Vimeo.

I know what you’re dying to ask me, and no, Pixar still has not been in touch.

It’s that time of year…

The end of the quarter is here already! That’s right, it’s finals time. In-class finals seem to be a rarity in grad school, but lucky me – I have two (Complex Analysis and Chemical Oceanography)! Both on Monday. The figure above is just to give you a little taste of what my weekend will be like. It is a portion of my formula sheet for the complex analysis exam. We’re allowed one 5″x7″ index card with any formulas or notes we can fit on it.

Just for fun, here’s Mr. Bean taking an exam.

Mount Rainier and chamomile tea

John is in Alaska!  He just left today (as you might have guessed by the drawing), and called to let me know he made it to Anchorage.  In order to divert my attention from the quiet house (and also because I have a project due in less than a week), I spent the day figuring out the physics behind gravity measurements.  Oh, geodesy, so dear to my heart – but the details are so distant in my memory.

Last day of inverse theory :-(

Today was our last inverse theory class.  We didn’t have time to cover the conjugate gradient method, which was too bad; I was really hoping we would get to that stuff.  One of our lab group’s weekly readings was on the double-differencing technique for earthquake location.  In it, they described using a conjugate gradient-type method called LSQR.   It’s supposed to let you avoid inverting a giant matrix when solving inverse problems.  From my very basic understanding, it finds the answer iteratively, making successive guesses that step toward the minimum of some objective surface.  The good news is that Matlab has a canned lsqr function, so I could just try it out and see for myself.  Although a canned function can also be a bad thing, because then it’s easy to treat it like a black box.
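Here’s the kind of five-minute sanity check I had in mind, on a made-up sparse least-squares problem rather than anything from the earthquake paper:

% Quick lsqr sanity check on a made-up sparse least-squares problem
rng(0);
m = 5000;  n = 800;
A = sprandn(m, n, 0.01);          % sparse rectangular "design" matrix
x_true = randn(n, 1);
d = A*x_true;                     % synthetic data (noise-free, to keep it simple)

% Iterative solve: never forms A'*A or inverts anything big
[x_lsqr, ~, ~, iter] = lsqr(A, d, 1e-8, 200);

% Compare against the direct (QR-based) least-squares solution
x_direct = A\d;
fprintf('%d iterations, ||x_lsqr - x_direct|| = %.2e\n', iter, norm(x_lsqr - x_direct))

Reassuringly, the iterative answer matches the direct one, and nowhere along the way does a giant dense matrix get built.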


Streamlines and statistics

I finally, finally downloaded R today.  I’ve been meaning to check it out for ages.  I only spent about 10 minutes on it once I had it downloaded though – things are too busy to be playing with a new toy.  Next week is finals week.  Probably best to work on the things that have deadlines in the next 7 days.  Particularly those things which require me to give presentations in front of people.  Turns out that the prospect of public humiliation is a pretty decent motivator, actually.  I’ve been working away pretty diligently on getting those projects finished.

Gravity and flowers

It’s been busy around here.  I’m giving a talk at the Acoustical Society of America meeting, in a session on Friday.  Found out that I’d accidentally overwritten the last round of edits that I’d done.  Boo!  Luckily it didn’t take too long to sort it out.  Phew! And of course it helped my mood to see a vase of very pretty orange daisies waiting for me on my desk when I got home 🙂


Quadrature signals – tutorial

Once again, I’m doing a bit of signal processing and, as usual, I found myself needing to brush up on the basics.  And one of my favorite little tutorials is called Quadrature Signals:  Complex but not Complicated.  It’s written by Richard Lyons, who’s also the author of what I’ve been told is a good book on signal processing.  I like the writing style – it’s fun and easy to read.  Well, okay, honestly it gets pretty complicated by the end.  But I still like it.

Last night I did my first cross correlation since 2008.  Woo hoo!  And it worked!  Sort of.  It still needs some tweaks, and I also need to improve my input signal.  But it feels good to be using the signal processing toolbox in Matlab again 🙂  Once I get it sorted out a bit more, I hope to post the relevant snippets of code up here.
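In the meantime, here’s roughly the flavor of it, with a made-up tone burst standing in for my actual input signal (so, not the real code, just the skeleton). The quadrature connection: taking the analytic signal of the correlation with hilbert gives you its envelope, which is a much friendlier thing to pick a peak off of than the raw, oscillating correlation.

% Stand-in example: cross correlation plus an analytic-signal envelope
fs = 48e3;
t  = 0:1/fs:0.02;                          % 20 ms replica
x  = sin(2*pi*6e3*t) .* hann(numel(t))';   % windowed tone burst as the "call"

s  = randn(1, round(0.5*fs));              % half a second of noise
i0 = round(0.2*fs);                        % bury the call at t = 0.2 s
s(i0:i0+numel(x)-1) = s(i0:i0+numel(x)-1) + x;

[c, lags] = xcorr(s, x);                   % raw cross correlation
env = abs(hilbert(c));                     % envelope via the analytic signal

[~, imax] = max(env);
fprintf('envelope peak at t = %.4f s (call starts at %.4f s)\n', ...
        lags(imax)/fs, i0/fs)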

Quadrature Signals: Complex but not Complicated