Hydrophone arrays, FTW!

Figure 1. Photo of Amy and Emily deploying the array off the back deck of the R/V Ocean Starr (and pulling it back in at the end of the day).

We do a lot of things out here on the CalCurCEAS cruise – we play cribbage, we eat cookies, we ride the stationary bicycle – but mostly we do these two things, A LOT:

1) Look for whales and dolphins.
2) Listen for whales and dolphins.

Part 1, the “Look” part, falls to the visual team, who stand bravely in wind and cold and rain (and sometimes gorgeous, balmy sunshine) and scan the surface of the ocean for animals that might pop up to say hi and/or take a quick breath of air.

Part 2, the “Listen” part, exists because, well, it’s pretty difficult to see what animals are doing once they dip down beneath the waves. And we all know that’s where the fun stuff happens (synchronized swimming practice? underwater tea parties? you never know!).

Since I’m on the acoustics team, I thought I’d give a little overview of how our operation works. We eavesdrop on the animals using a pair of hydrophones, which are basically underwater microphones. With just one hydrophone we could still hear the animals, but adding a second allows us to not only hear them, but also figure out where they are.

Figure 2. Top: Array geometry, showing how we can tell which direction a sound is coming from. (The poor creature pictured is my sad attempt at a striped dolphin, if you were wondering.) Bottom: Signals measured on each of the hydrophones.

The pair of hydrophones are built into the end of a cable that is spooled onto a winch on the back deck. Every morning we head out there, don our life jackets, hard hats, and boots, and reel out the cable until the hydrophones are 300 meters behind us. The hydrophones are spaced one meter apart, and once the vessel is up to speed, they are roughly at the same depth.

The upper part of Figure 2 (blue) shows a cartoon schematic of the hydrophone setup. Here’s what it all means:

H_1 and H_2 – hydrophones #1 and #2

Known quantities:

d_h – distance between H_1 and H_2. For our array this distance is 1 meter

c_w – measured sound speed in water (approximately 1500 m/s, but depends on temperature and salinity)

Measured/derived quantities:

\Delta t – time delay between when the signal arrives at H_1 and H_2

d' – distance associated with the time delay \Delta t, derived using c_w

Unknown quantity:

\theta – angle between the incoming ray path and the array baseline. This is what we’re solving for!

OK, that seems complicated. But feel free to ignore the math, of course. The basic idea is that depending on where the sound comes from, it can arrive at the hydrophones at different times. For example, in the image above, it hits hydrophone 1 first, and after some amount of time, it hits hydrophone 2. The signals from each of the hydrophones are sent upstairs to the acoustics lab (see bottom part of Figure 2). The call shows up at slightly different times on each of the hydrophone channels, and we can measure that time delay \Delta t very precisely.
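If you're wondering how we can measure that delay so precisely, one standard trick is cross-correlation: slide one channel against the other and find the lag where the two signals line up best. Here's a rough sketch in Python, with a made-up sample rate and a synthetic "click" standing in for a real dolphin (this is just an illustration of the idea, not the actual software we run at sea):

import numpy as np

def estimate_time_delay(h1, h2, fs):
    """Return how much later the signal arrives on h2 than on h1, in seconds."""
    xcorr = np.correlate(h2, h1, mode="full")      # slide h2 across h1
    lags = np.arange(-len(h1) + 1, len(h2))        # lag of h2 relative to h1, in samples
    return lags[np.argmax(xcorr)] / fs

# Toy example: the same click, arriving 77 samples (~0.4 ms) later on hydrophone 2.
fs = 192_000                                   # sample rate in Hz (made up for this example)
t = np.arange(0, 0.01, 1 / fs)
click = np.exp(-((t - 0.005) / 1e-4) ** 2)     # synthetic Gaussian "click"
h1 = click
h2 = np.roll(click, 77)

print(estimate_time_delay(h1, h2, fs))         # ~0.0004 s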

Using the time delay \Delta t and the measured sound speed c_w, we can obtain the distance d':

d' = c_w \Delta t

So now we’ve got a right triangle where we know the hypotenuse and one other side, and you know what that means – trigonometry time!! Everyone’s favorite time! We finally have what we need to solve for the angle \theta.

\theta = \arccos\left(\frac{d'}{d_h}\right)
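To make that concrete with some made-up numbers: a delay of 0.4 milliseconds gives d' = 1500 m/s × 0.0004 s = 0.6 m, and \theta = acos(0.6 / 1) ≈ 53°. Or, as a few lines of Python:

import math

C_W = 1500.0   # sound speed in water, m/s (really it varies with temperature and salinity)
D_H = 1.0      # spacing between the two hydrophones, m

def bearing_angle(delta_t, c_w=C_W, d_h=D_H):
    """Angle (degrees) between the incoming ray path and the array axis."""
    d_prime = c_w * delta_t      # extra distance the sound travels to the far hydrophone
    if abs(d_prime) > d_h:       # can't exceed the spacing; measurement noise can push it over
        raise ValueError("time delay too large for this spacing and sound speed")
    return math.degrees(math.acos(d_prime / d_h))

print(bearing_angle(0.4e-3))     # ~53.1 degrees off the array axis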

We now know what angle the dolphin is at relative to the array. Huzzah! But wait. There are just a couple of little details that you need to remember (see Figure 3). First: you don’t know how far away the dolphin is. Second: there’s this pesky thing called “left-right ambiguity” *.


From the perspective of the array, there’s no difference between an animal calling from an angle \theta to the left and an animal calling from an angle \theta to the right.

Figure 3. Left-right and range ambiguity.

These are fundamental limitations of the method, but we can (sort of) overcome them. As the vessel moves along, and we estimate angles at different locations, we end up with a location where most of the bearing lines intersect. If the vessel is traveling in a straight line, we can get a good idea of range – how far the animal is from the trackline. We just won’t know which side of the trackline it’s on. But if the vessel makes a turn, the new bearings estimated after the turn will resolve which side of the line it’s on!

Figure 4. Bearing estimates (red lines) taken at different locations along the track line. Probable location is where most of the lines intersect.
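Just to show the geometry, here's a toy version of that idea in Python. The animal's position and the ship's track are invented, and I compute the "measured" bearings straight from them instead of from real time delays. (I'm also measuring angles from the trackline here rather than from the trailing array, which just flips forward and backward; the left/right ambiguity is the same.) The point is only that, while the ship runs straight, the left and right solutions are equally believable:

import numpy as np

def intersect(p1, d1, p2, d2):
    """Point where two bearing lines cross (each given by a start point and a direction)."""
    t = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t[0] * d1

def directions(theta_deg):
    """The two mirror-image unit vectors (left and right of the trackline) for one bearing."""
    th = np.radians(theta_deg)
    return np.array([np.cos(th), np.sin(th)]), np.array([np.cos(th), -np.sin(th)])

# Invented scenario: animal sitting at (600, 300); ship steaming east along y = 0.
animal = np.array([600.0, 300.0])
ship = [np.array([0.0, 0.0]), np.array([200.0, 0.0])]

# "Measured" bearings from each ship position (computed from the known animal
# position here, since this is just a demo).
bearings = [np.degrees(np.arctan2(animal[1] - p[1], animal[0] - p[0])) for p in ship]

for side in (0, 1):   # 0 = one side of the trackline, 1 = its mirror image
    fix = intersect(ship[0], directions(bearings[0])[side],
                    ship[1], directions(bearings[1])[side])
    print("side", side, "->", fix)   # (600, 300) and (600, -300): the ambiguity!

# Once the ship turns, bearings taken on the new heading only stay consistent
# with one of those two mirror solutions, and the ambiguity goes away.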

At this point you might be wondering, Michelle, what assumptions are you making when you do these calculations? So here they are:

Assumptions**:

  • The array is horizontal
  • The animals are calling at the surface
  • The animals are staying in approximately the same location for the duration of the measurements

So there you have it. That’s how we locate animals underwater with a towed, 2-element hydrophone array.

 

* Yin, one of the amazing visual observers on the cruise, thinks “Left-right ambiguity” would be a great name for a band, and I agree.

** assumptions are made to be broken

Whales, dolphins, and seabirds, oh my!


Hi all! I’m on a ship called the R/V Ocean Starr, and we’re out on the 2014 California Current Cetacean and Ecosystem Assessment Survey, or CalCurCEAS for short. We are collecting data that will allow NOAA scientists to estimate the abundance of whales and dolphins off the west coast of the U.S. They’ll also be able to use these data to better understand what affects the distribution of marine mammals – where do they like to hang out, and why? We’re gathering these data using two primary methods, visual and acoustic, and we’re also conducting photo-ID and biopsy sampling of species of special interest.

In addition to the marine mammal portion of the survey, we’re looking at the pelagic ocean ecosystem in the region.  This means taking measurements of various properties of the water, doing net tows, and using acoustic backscatter to look at biomass in the upper part of the water column. There are also two observers onboard to survey seabirds.

I’m out here with an awesome science team and a great crew. There are two other acousticians besides me: Emily T. Griffiths and Amy Van Cise. Emily has been onboard for the first two legs. She makes sure we stay on track and don’t get into (too much) trouble. Amy is a PhD student at Scripps Institution of Oceanography studying pilot whales, and she’s here for the third leg of the cruise, just like me. The three of us all love ice cream and get along famously.

I have one or two shiny new blog posts that I’m hoping to share soon (with comics! woo!), and I might even have a couple of surprise guest posts! Stay tuned…

First attempt at HDR Photography

I decided to take my camera and tripod out for my walk with Trooper tonight so that I could try out HDR photography (which stands for High Dynamic Range). According to the very minimal amount of research I’ve done, I think that what I am doing may not actually be HDR – something to do with tone mapping vs. exposure blending… well, never mind. I don’t really care what it’s called.  Moving onwards!

I set the camera to aperture priority, put it on a tripod, and took three photos at different exposure compensation settings (-2, 0, and +2 EV). Here are the original images:

First, with exposure set to -2EV:

Next, with exposure set to 0EV:

Finally, with exposure set to +2EV:

The final result was blended using layer masks in GIMP.  It looks a bit wonky because I did it really quickly, particularly around the edges between the sky and the foreground.  After all that, I pulled it into Adobe Lightroom 4 and made a few tweaks to produce this image:
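If you'd rather script the blend than paint layer masks by hand, OpenCV has a built-in exposure-fusion routine, the Mertens method, that does something in the same spirit as what I did in GIMP. Here's a rough sketch; the filenames are just placeholders for the three brackets:

import cv2
import numpy as np

# Load the three bracketed exposures (placeholder filenames).
files = ["bracket_m2.jpg", "bracket_0.jpg", "bracket_p2.jpg"]
images = [cv2.imread(f) for f in files]

# Mertens exposure fusion blends the brackets with per-pixel quality weights
# (contrast, saturation, well-exposedness), so no separate tone-mapping step.
fused = cv2.createMergeMertens().process(images)

# The result comes back as float, roughly in [0, 1]; convert back to 8-bit and save.
out = np.clip(fused * 255, 0, 255).astype("uint8")
cv2.imwrite("fused.jpg", out)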

Some details, if anyone cares:

– Nikon D5100

– 18-55mm (good old kit lens)

– GIMP

– Adobe Lightroom 4

– Wacom Intuos 4 tablet

– MacBook Pro

– cheap-ass plastic tripod (seriously, it was like $9)

Side note: a giant rat ran out onto the wood-chip path right in front of me while I was setting up. GROSS!

Oscar the dog

Remember Oscar?  This is him – Steph emailed me the picture today.  John pointed out that he looks really big here, but it’s really an optical illusion.  He’s actually really scrawny and small – 40 pounds or so on the day they found him.  Ian and I were giving him an outdoor bath because he smelled really terrible.  He smells much better these days.  (so I hear)

Camera fun

I want a new camera.  But since I can’t afford one, I’m trying to appreciate the one I already have.  It was probably new around 2003 or so – it’s a Canon PowerShot Pro1.  I found it on Craigslist back in Santa Barbara.  Nothing too fancy, but a small step up from a point-and-shoot.  I know nothing about photography other than what I know from skimming the PowerShot’s manual (which I’ve since misplaced) and occasional googling.  Today’s project was figuring out how to focus on one subject – controlling depth of field.

Yeah, the subject is pretty boring, but I think I’m starting to get how to make my camera do this.  Maybe next I’ll move on to something more interesting.

Study break

It’s finals week, and not much is happening with the blog these days.  I was happy to see that the past few days of rain have successfully scrubbed the sky, and we have some patches of blue.  Days like these make me happy to work in the Marine Sciences building – it’s older, and sometimes it smells funny, but we do have a great view.  (Sorry for the wonky picture; it’s from the Hipstamatic iPhone app that I got a while back, and I still haven’t quite figured it out.)