Life after grad school

For a while there, it seemed like I was going to be a graduate student forever…then I graduated and got a job. And then I decided to go *back* to grad school (oh, Michelle…). But then, in December 2016, I successfully defended my dissertation – Huzzah! (I like to tell my nieces I made it to grade 22, how funny/terrifying!) Then about a month and a half after wrapping up my PhD, I started working (remotely) for JASCO Applied Sciences. (oh, and a couple of months after that I had a baby.)

JASCO is a company that does consulting and research for assessing and mitigating underwater noise. They sort of do it all: they design and build super cool underwater acoustic sensors, then install those sensors and collect data all around the world, often in remote and dangerous locations. They measure sounds produced by marine animals like whales, dolphins, seals, fish, crabs… basically, if it makes a sound and lives underwater, JASCO is probably gonna record it at some point. They also record sounds from noise sources like ships and seismic experiments. Once all those data are collected, JASCO scientists crunch through them – signal processing, acoustic propagation modeling, interpretation, whatever needs to be done to understand what’s happening in the world of underwater sound.

My job at JASCO is a blend of things – data analysis and visualization, but also education and outreach. Here I am at my home-office, working on a comic about marine noise, and how we can measure it:

Communicating science + baby wearing, for the win.
Hopefully I’ll have more posts to share soon, so stay tuned!

Simple Fourier Transform demo


(Go here to access the interactive visualization.)

This little project has kept me entertained over the holidays! I thought it would be useful to come up with a really simple interactive web visualization to illustrate what a Fourier Transform is. And of course, it’s also been a great way for me to get some javascript practice.
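The demo itself is in JavaScript, but the core idea fits in a few lines of Python (a sketch of mine, assuming NumPy – not the demo code itself): build a signal out of two sine waves, run it through an FFT, and the component frequencies pop out as peaks in the spectrum.

```python
import numpy as np

# Build a 1-second signal from two sinusoids: 50 Hz (amplitude 1) and 120 Hz (amplitude 0.5)
fs = 1000                        # sample rate in Hz
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The FFT turns the time-domain signal into its frequency components
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

# Crude peak picking: keep the bins that stand well above everything else
peaks = freqs[spectrum > 0.2 * len(signal)]
print(peaks)  # [ 50. 120.] -- the two frequencies we put in
```

That's really all the interactive version does, just with sliders feeding the sine waves and Plotly drawing the two panels.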

It’s a really simple example, and could definitely be improved – with the ability to add more signals, for example, and some design polish.

If you want to check out the code, it’s up on my GitHub page. I decided to try out Plotly for the graphs, and used Bootstrap for the layout.

Wave interference

I’m TAing a marine GIS/ocean mapping course this quarter, and will be teaching a lecture on some aspects of multibeam sonars and the data you get from them. Which is fun, given that I haven’t really thought much about multibeam sonars since I worked at a sonar company about 6 years ago.

I was trying to figure out how to explain why a longer array gives you a narrower beam. It’s all about interference patterns, right? So I wrote a little script in Python (you can access the script directly on my GitHub page).

Let’s say you have two elements spaced a half wavelength apart. You get something like this, with one main lobe:


Cool – there’s just one main lobe of higher amplitude. But then if you pull those elements just a bit further apart – I’m showing a 5 wavelength separation here – you can see a completely different pattern:


So: the wider separation made more beams, and those beams were narrower. Interesting… What if we wanted to have just one main beam that we could maybe steer around? (Ahem. Maybe a little like a multibeam sonar??) The next picture shows a single beam produced by a line of 20 elements, all spaced half a wavelength apart from each other. This time we’re zooming out a bit – showing a 30 m x 30 m area. This simulation also shows what it would look like with a 200 kHz signal, which is a pretty common frequency for a shallow-water multibeam sonar.
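The full script is on GitHub, but the core calculation is tiny. Here's a minimal sketch (not the script itself – function and variable names are mine) of the far-field array factor for a uniform line array, which is what produces those lobe patterns:

```python
import numpy as np

def array_factor(n_elements, spacing_wl, theta):
    """Far-field array factor magnitude for a uniform line array.

    n_elements: number of elements in the line
    spacing_wl: element spacing in wavelengths
    theta:      angle(s) from broadside, in radians
    """
    theta = np.atleast_1d(theta)
    n = np.arange(n_elements)
    # phase of element n relative to element 0, for each look angle
    phase = 2j * np.pi * spacing_wl * np.outer(np.sin(theta), n)
    return np.abs(np.exp(phase).sum(axis=1))

# Two elements, half-wavelength spacing: one broad main lobe at broadside,
# falling to zero at endfire
print(array_factor(2, 0.5, [0.0, np.pi / 2]))   # [2., ~0.]

# Pull those same two elements 5 wavelengths apart and extra lobes appear:
# there's a full-strength lobe wherever spacing * sin(theta) hits an integer
print(array_factor(2, 5.0, [np.arcsin(0.2)]))   # [~2.] -- a lobe well off broadside
```

The 20-element, half-wavelength case from the last figure is just `array_factor(20, 0.5, theta)` swept over a grid of angles – more elements narrow the main lobe without adding grating lobes.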

Beam pattern example

Adventures in javascript


After about two frenzied weeks of muddling through javascript and D3 (and html and css) Helena and I managed to wrap up what I think is a pretty neat D3 visualization of oceanographic data for the class we’re taking.

Check it out here – just click on the map to see the temperature and salinity profiles. And if you’d like to see what’s going on under the hood (or re-create it from scratch), the whole thing is up on our GitHub page (along with a fairly detailed readme file describing our process).

Helena has done some neat D3 stuff before – check it out on her site here.


I’m back! Maybe?

Let’s see, it looks like my last post was in November 2014. What can I say? Sometimes life gets crazy for a bit. It’s still crazy now, but I’m excited about some things that are happening. For example: I’m nearing the end of my PhD! I’m tentatively planning to defend in the next few months. Also, I’m taking a really fun (and very intense) data visualization class. So far I’ve had a chance to play around with Tableau and Trifacta Wrangler. But the one with the biggest learning curve – and perhaps the biggest reward (i.e. awesome interactive web graphics) – is D3. I’m hoping to post some stuff as I learn…

For today, I’m not making my own comic. But Helena sent me a link to this xkcd comic that captures my experience with git so far.


Making a comic: Whale Scout edition


I’ve been working on a little comic for Whale Scout. It’s taken me a while, but it’s finally done, and will be up on the Whale Scout website soon! Woo!

I used the iPad app Procreate, which I’ve been a fan of for a while now. Years, maybe? I loved it from the get-go, and it really just keeps getting better. I’ve tried a lot of different drawing/painting apps, and this one is by far the best. Highly recommend! As for ancillary tools: I used an iPad 3, a Bamboo stylus, and Premiere Pro for some minor editing. I might as well also reveal that I got my best work done while sitting on the couch with Modern Family playing in the background. Don’t question the method, people!

One of the features in the Procreate app allows you to export a video that shows the progression of your artwork. I’m actually not sure anyone other than me will be even vaguely interested in watching it, but here it is anyway.

Making a comic: Whale Scout! from Michelle Wray on Vimeo. Music: Something Elated (Broke For Free) / CC BY 3.0

Hydrophone arrays, FTW!

Figure 1. Photo of Amy and Emily deploying the array off the back deck of the R/V Ocean Starr.

We do a lot of things out here on the CalCurCEAS cruise – we play cribbage, we eat cookies, we ride the stationary bicycle – but mostly we do these two things, A LOT:

1) Look for whales and dolphins.
2) Listen for whales and dolphins.

Part 1, the “Look” part, is within the realm of the visual team, who stand bravely in wind and cold and rain (and sometimes gorgeous, balmy sunshine) and scan the surface of the ocean for animals that might pop up to say hi and/or take a quick breath of air.

Part 2, the “Listen” part, exists because, well, it’s pretty difficult to see what animals are doing once they dip down beneath the waves. And we all know that’s where the fun stuff happens (synchronized swimming practice? underwater tea parties? you never know!).

Since I’m on the acoustics team, I thought I’d give a little overview of how our operation works.  We eavesdrop on the animals using a pair of hydrophones, which are basically underwater microphones. If we just used one hydrophone, it would be okay, we could hear the animals. But adding a second allows us to not only hear them, but figure out where they are.

Figure 2. Top: Array geometry, showing how we can tell which direction a sound is coming from. (The poor creature pictured is my sad attempt at a striped dolphin, if you were wondering) Bottom: Signals measured on each of the hydrophones.

The pair of hydrophones are built into the end of a cable that is spooled onto a winch on the back deck. Every morning we head out there, don our life jackets, hard hats, and boots, and reel out the cable until the hydrophones are 300 meters behind us. The hydrophones are spaced one meter apart, and once the vessel is up to speed, they are roughly at the same depth.

The upper part of Figure 2 (blue) shows a cartoon schematic of the hydrophone setup. Here’s what it all means:

H_1 and H_2 – hydrophones #1 and #2

Known quantities:

d_h – distance between H_1 and H_2. For our array this distance is 1 meter

c_w – measured sound speed in water (approximately 1500 m/s, but depends on temperature and salinity)

Measured/derived quantities:

\Delta t – time delay between when the signal arrives at H_1 and H_2

d' – distance associated with the time delay \Delta t, derived using c_w

Unknown quantity:

\theta – angle between the incoming ray path and the array baseline. This is what we’re solving for!

OK, that seems complicated. But feel free to ignore the math, of course. The basic idea is that depending on where the sound comes from, it can arrive at the hydrophones at different times. For example, in the image above, it hits hydrophone 1 first, and after some amount of time, it hits hydrophone 2. The signals from each of the hydrophones are sent upstairs to the acoustics lab (see bottom part of Figure 2). The call shows up at slightly different times on each of the hydrophone channels, and we can measure that time delay \Delta t very precisely.

Using the time delay \Delta t and the measured sound speed c_w, we can obtain distance d' using:

d' = c_w * \Delta t

So now we’ve got a right triangle where we know the hypotenuse and one other side, and you know what that means – trigonometry time!! Everyone’s favorite time! We finally have what we need to solve for the angle \theta.

\theta = \arccos\left( \frac{d'}{d_h} \right)
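In code, the whole calculation is just a couple of lines. Here's a minimal sketch (my own function and variable names, not anyone's actual survey software) using the numbers from our array, d_h = 1 m and c_w ≈ 1500 m/s:

```python
import math

def bearing_angle(dt, d_h=1.0, c_w=1500.0):
    """Angle (radians) between the incoming ray path and the array axis.

    dt:  measured time delay between hydrophones H1 and H2, in seconds
    d_h: hydrophone separation, in meters
    c_w: sound speed in water, in m/s
    """
    d_prime = c_w * dt                  # extra path length to the far hydrophone
    ratio = d_prime / d_h
    if not -1.0 <= ratio <= 1.0:
        raise ValueError("time delay too large for this geometry")
    return math.acos(ratio)

# A delay that makes d' = 0.5 m on a 1 m array gives a 60-degree bearing
theta = bearing_angle(0.5 / 1500.0)
print(round(math.degrees(theta), 6))    # 60.0

# Zero delay means the sound arrived broadside-on (90 degrees to the array)
print(bearing_angle(0.0) == math.pi / 2)   # True
```

The range check matters in practice: noise in the delay measurement can push d' slightly past d_h, which would otherwise feed acos an impossible value.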

We now know what angle the dolphin is at relative to the array. Huzzah! But wait. There are just a couple of little details that you need to remember (see Figure 3). First: you don’t know how far away the dolphin is. Second: there’s this pesky thing called “left-right ambiguity” *.


From the perspective of the array, there’s no difference between an animal calling from an angle \theta to the left and an animal calling from an angle \theta to the right.

Figure 3. Left-right and range ambiguity.

These are fundamental limitations of the method, but we can (sort of) overcome them. As the vessel moves along, and we estimate angles at different locations, we end up with a location where most of the bearing lines intersect. If the vessel is traveling in a straight line, we can get a good idea of range – how far the animal is from the trackline. We just won’t know which side of the trackline it’s on. But if the vessel makes a turn, the new bearings estimated after the turn will resolve which side of the line it’s on!
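Here's a tiny simulation of that idea (made-up numbers and my own function names, just to illustrate). While the vessel runs in a straight line, an animal and its mirror image across the trackline produce exactly the same measured angle; once the heading changes, the two sides finally disagree:

```python
import math

def measured_cone_angle(vessel, heading, animal):
    """Angle between the vessel's heading (a unit vector) and the line of
    sight to the animal -- the only thing a 2-element towed array can measure."""
    dx, dy = animal[0] - vessel[0], animal[1] - vessel[1]
    r = math.hypot(dx, dy)
    return math.acos((dx * heading[0] + dy * heading[1]) / r)

animal = (500.0, 300.0)          # true position, 300 m left of the trackline
mirror = (500.0, -300.0)         # its left-right twin

# Heading east along the trackline: the two candidates are indistinguishable
a_true = measured_cone_angle((0.0, 0.0), (1.0, 0.0), animal)
a_mirror = measured_cone_angle((0.0, 0.0), (1.0, 0.0), mirror)
print(a_true == a_mirror)        # True

# After a 90-degree turn to the north, the angles differ -- ambiguity resolved
b_true = measured_cone_angle((600.0, 0.0), (0.0, 1.0), animal)
b_mirror = measured_cone_angle((600.0, 0.0), (0.0, 1.0), mirror)
print(abs(b_true - b_mirror) > 0.1)   # True
```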

Figure 4. Bearing estimates (red lines) taken at different locations along the track line. Probable location is where most of the lines intersect.

At this point you might be wondering, Michelle, what assumptions** are you making when you do these calculations? So here they are:


  • The array is horizontal
  • The animals are calling at the surface
  • The animals are staying in approximately the same location for the duration of the measurements

So there you have it. That’s how we locate animals underwater with a towed, 2-element hydrophone array.


* Yin, one of the amazing visual observers on the cruise, thinks “Left-right ambiguity” would be a great name for a band, and I agree.

** assumptions are made to be broken

Whales, dolphins, and seabirds, oh my!


Hi all! I’m on a ship called the R/V Ocean Starr, and we’re out on the 2014 California Current Cetacean and Ecosystem Assessment Survey, or CalCurCEAS for short. We are collecting data that will allow NOAA scientists to estimate the abundance of whales and dolphins off the west coast of the U.S. They’ll also be able to use these data to better understand what affects the distribution of marine mammals – where do they like to hang out, and why? We’re gathering these data using two primary methods – visual and acoustic – and we’re also conducting photo-ID and biopsy sampling of species of special interest.

In addition to the marine mammal portion of the survey, we’re looking at the pelagic ocean ecosystem in the region.  This means taking measurements of various properties of the water, doing net tows, and using acoustic backscatter to look at biomass in the upper part of the water column. There are also two observers onboard to survey seabirds.

I’m out here with an awesome science team and a great crew. There are two other acousticians besides me: Emily T. Griffiths and Amy Van Cise. Emily has been onboard for the first two legs. She makes sure we stay on track and don’t get into (too much) trouble. Amy is a PhD student at Scripps Institution of Oceanography studying pilot whales, and she’s here for the third leg of the cruise, just like me. The three of us all love ice cream and get along famously.

I have one or two shiny new blog posts that I’m hoping to share soon (with comics! woo!), and I might even have a couple of surprise guest posts! Stay tuned…

Argo Floats!!

If you have ever sat there and wondered to yourself, “What are Argo Floats? And why are they the coolest things ever?” – Look no further, my friend. I have just the (science-comics) video for you.  It’s based on an interview I did a while back with Rick Rupan, who runs the Argo Floats program at the University of Washington.

This format is a bit experimental for me, but I LOVE this style, and wanted to give it a whirl. I sort of just jumped in, and because I was (am!) so very clueless, I asked basically everyone I know (and some people who I don’t know) to give me feedback. The suggestions started rolling in, and I tried to incorporate as many as I could.

I really hope you enjoy this, and if I’m lucky, you’ll even learn something along the way 🙂 I do plan to make more of these, so if you have any feedback for me after watching this, it would be a great help.

Okay, enough babbling. Go ahead and watch the video already!




Harmful algal bloom forecasting: Chasing toxin producers

It started with “drunk” pelicans. While studying aquatic sciences at the University of California, Santa Barbara, Liz Tobin noticed that there was something wrong with the fish-gulping birds. They exhibited unusual behavior: getting hit by cars, and sometimes simply falling out of the sky. The pelicans suffered from seizures induced by domoic acid poisoning – a particularly dangerous affliction when it hits mid-flight. Biologists traced the source of the poisoning to a rapid springtime growth (a bloom) of Pseudo-nitzschia, a tiny, single-celled organism distantly related to the giant kelps thriving in nearby coastal waters. Small fish and shellfish consume these algae, and although they are not harmed themselves, they store the toxins produced by the algal cells and pass them on to their predators, including seabirds and marine mammals.

Scientists refer to these incidents as harmful algal blooms, or HABs. They are not restricted to the waters of southern California, though, and Pseudo-nitzschia cells are not the only culprits – a number of other algal species can cause trouble for local marine ecosystems. HABs occur in coastal waters around the globe, with larger and more frequent blooms being linked to a warming sea surface [1] and to increased nutrient runoff from land [2], a common by-product of animal and plant agriculture. Local economies suffer due to tourism losses, and local residents can’t harvest shellfish from their beaches.

After wrapping up a bachelor’s degree at UCSB, Liz moved on to graduate school at the University of Washington so that she could study harmful algal species in Puget Sound. One of the species she’s interested in is Alexandrium catenella, another single-celled marine alga, which produces a suite of deadly neurotoxins causing paralytic shellfish poisoning, or PSP. Alexandrium blooms can become so intense that they give the water a reddish-brown tinge, often called a “red tide” (although they don’t always cause water discoloration).


If you eat a shellfish that has accumulated Alexandrium toxins, a series of unpleasant events unfolds. First, your fingers and toes will tingle and your lips will feel numb.  Next, you may become nauseous and unsteady on your feet.  If the toxins hit you at full force, your diaphragm will become paralyzed, which means that you are unable to pull oxygen into your lungs.  Without prolonged artificial respiration, a severe case of PSP can lead to death in a matter of hours.

In recent years, poisoning cases in Washington State have been rare, but they do happen. The Washington Department of Health aims to prevent outbreaks by regularly testing shellfish along Puget Sound beaches, and commercially harvested shellfish are tested exhaustively before hitting the markets. Unfortunately, it’s tough to predict the severity, location, and timing of an Alexandrium bloom. That’s where Liz’s work fits in: she’d like to improve our ability to forecast those blooms so that public health officials, fisheries managers, and the shellfish industry have more time to react.


Alexandrium cells have a comfort zone, she explains. They grow and divide furiously when the days are long and the surface waters are warm. Once the spring bloom ramps up, zooplankton, fish, and shellfish feed on the algae, eventually decimating Alexandrium populations by early to mid summer. Fewer Alexandrium cells means less food for their predators, which begin to die off or find food elsewhere. By late August, with newly lowered predation pressure and plenty of sunlight, a second bloom typically occurs. As the season draws to a close, Alexandrium cells feel the effects of shorter days and less sunlight. They struggle to grow and divide, and at some point cut their losses and transit down to the sediment to wait out winter. Settled into the muddy seafloor, they shift to survival mode, also known as the “cyst” stage. The exact conditions that send them into their cyst stage are poorly understood, as are the details of how they wake up and make their way to the surface come springtime.

Liz is particularly interested in how the Alexandrium cells swim, even though they are mostly at the whim of the currents.  Tides rush in and out twice each day, racing through narrows and washing languidly over mud flats.  Rivers pour into Puget Sound with irregular pulses of freshwater from rain and springtime snow melt. Alexandrium algal cells propel themselves through the water using a small whip-like appendage called a flagellum, but because of their tiny size – only about a third of the width of a single human hair – they can’t overcome even the weakest currents. However, if Liz knows their vertical swimming capabilities, she can better understand how quickly Alexandrium cells migrate from the seafloor to the surface (where blooms occur) and back again.

A lot of researchers, Liz included, study Alexandrium in the lab.  They blend solutions of water and various chemicals to imitate seawater, and observe how the cells behave and react to a variety of environmental triggers.  One of the labs she works in is a tiny bunker-like space on the second floor of the Ocean Teaching Building.  Two long black tables running along either side of the room are covered by a hodge-podge of beakers, elaborate video recording setups, and circuit boards sprouting nests of red and white wires.  At the back of the room, a giant insulated door opens to a fully programmable walk-in refrigerator, where algal cells can be subjected to controlled shifts in temperature and light.

Even at their best, however, lab experiments can’t possibly recreate the many complexities that exist in the natural world.  Liz knew that if she could monitor the emergence of the Alexandrium cells from the sediments she could combine those observations with detailed fluid flow models to better predict where the cells would eventually concentrate.  The only problem was that there were no “off-the-shelf” instruments that were capable of doing what she wanted.  So she set out to build her own.


When she starts describing the instrument she’s working on, I interrupt. Pushing my notebook and pen across the table, I ask her to sketch her design for me. She draws a tube a few inches in diameter, sitting upright on the seabed. On one side of the tube, a camera in a waterproof housing points into a tiny window to capture any Alexandrium cells swimming past. The end of a plankton trap sits on top, ready to catch any upward-swimming cells so they can later be compared with the video. She pauses her drawing to look up at me. “It’s just like what we would do in the lab,” she says, “but instead of a camera looking at a tank, I have a camera looking into a chamber on the seafloor.”

As of mid-July 2013, Liz’s prototype of her seafloor camera was nearly ready to be deployed for testing.  If it works, she’ll try to capture Alexandrium cells emerging from the seafloor just prior to the late-summer bloom.  Her vision for the future, she tells me, is a network of these instruments deployed across Puget Sound, monitoring cyst emergence in real time.  Instead of only knowing about blooms once they are in full swing, they could be predicted days or weeks in advance, avoiding unnecessary closures and potentially saving lives.

[1] Climate change and harmful algal blooms, NCCOS

[2] Anderson, Donald M., Patricia M. Glibert, and Joann M. Burkholder. “Harmful algal blooms and eutrophication: nutrient sources, composition, and consequences.” Estuaries 25, no. 4 (2002): 704-726.