Learning HTML, CSS, and PHP

I can’t tell you how often I find myself wishing I could design my own web pages.  Or even tweak things like WordPress templates.  It’s a skill that I’ve wanted to work on for a long time now.

I actually started digging into this almost a year ago, when my sister and her husband visited, although things have been busy and I have abandoned it for some time now. Christian is really into web programming, and is also quite patient when it comes to sharing his knowledge. He suggested checking out a couple of books to get started:

1.  CSS: The Missing Manual

2.  PHP and MySQL for Dynamic Web Sites

3.  JavaScript & jQuery

I’ve started with the first two, just working through the examples.  I don’t think of myself as a total tech dummy, but I have to admit – I have very little knowledge of the inner workings of websites.  It’s actually a lot of fun to go through these books and see what’s going on in the background – so complicated, but so logical.

A couple of other things to mention:  Although it would be pretty awesome, I can’t actually afford the Adobe Design and Web Premium suite (even though it’s on massive discount for students – I will have to get it before I graduate!)  In the meantime, I’m happy to explore what my options are in the land of open source.  I found what seems to be a great alternative to Dreamweaver – it’s called Aptana, and it’s totally free!  It also links up directly to Firefox’s Firebug debugger, which is handy.  Of course, never having used Dreamweaver, it’s tough for me to give a fair comparison.  But it’s working just fine for me now.

Adventures in ObsPy

I finally got around to trying out ObsPy… well, okay, I am still trying to figure out how to install it properly. Finally made some headway though. I followed the instructions on the ObsPy installation page.  I thought everything was going smoothly.  For starters, I had all of the dependencies – most importantly Python, SciPy, NumPy, and Matplotlib.  I also had the easy_install tool already from when I was installing Mercurial.

The installations all seemed to go just fine – I was getting messages telling me it was all good.  But then it just didn’t work.  I opened up IPython, tried to give it a whirl, and it just didn’t recognize those packages at all.  Turns out the default installation location was not actually in the Python path.  It sounds like it should be pretty easy to fix, but it took me a long time to figure it out.

So, normally, if you want to import from a Python library, you would use something like this:

from numpy import *

But when I tried importing obspy.core, like this:

from obspy.core import read

it couldn’t find the library. After much searching, I finally figured out how to add to my Python path. First, in IPython, import sys and check the path:

import sys

sys.path

This will list what’s in the path. For me, the directory where the ObsPy packages had been installed wasn’t anywhere in that list.
To add to that path, I just used sys.path.append():

sys.path.append('/path/to/obspy/packages')

(with the actual ObsPy install directory in place of my placeholder path)
And it worked! Although I’m still not sure how to change it permanently. But until I figure that out, at least I can play with ObsPy.
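For reference, one standard way to make the change permanent (this is general Python behavior rather than anything ObsPy-specific, and the path below is just a placeholder) is to set the PYTHONPATH environment variable in a shell startup file like ~/.bashrc:

```shell
# Directories listed in PYTHONPATH get prepended to sys.path for every new Python session.
# Replace the placeholder with the actual ObsPy install location.
export PYTHONPATH=/path/to/obspy/packages:$PYTHONPATH
```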

53 New Posts

So, you might have noticed (okay, probably not) that I am now on post number 159.  Yup, that’s a big jump from the 100-post mark a few days ago.  How did I do that, you ask?  Well, for one, I finally realized that I could still import posts from my original WordPress.com blog (duh.  I’ve even done it before).  So I added those 53 posts from late 2009/early 2010 – sweet!  Now you can search back through the glorious dorkiness of my pre-grad school days.  Well, actually my between grad school days.  I used to be so into Linux and Python.  What happened?  I am going to make an effort to get back into it.  What better time than summer, during the few sunny days we get here in Seattle.

Kim was asking me lots of questions about Python and Linux this afternoon before she left.  It made me realize just how much I’ve forgotten (or how little I knew to begin with).  And it was a good kick in the butt to remind me to get back to learning about, and embracing, the world of Ubuntu and Numpy and Matplotlib…  Nerdy posts will resume!  But since a few people very dear to me enjoy my daily sketches, I will endeavor to continue those as well.  (I love you guys!)

Streamlines and statistics

I finally, finally downloaded R today.  I’ve been meaning to check it out for ages.  I only spent about 10 minutes on it once I had it downloaded though – things are too busy to be playing with a new toy.  Next week is finals week.  Probably best to work on the things that have deadlines in the next 7 days.  Particularly those things which require me to give presentations in front of people.  Turns out that the prospect of public humiliation is a pretty decent motivator, actually.  I’ve been working away pretty diligently on getting those projects finished.

Late-night acoustic propagation modeling

The ASA conference session yesterday on acoustic propagation modeling really sparked my interest.  And since I can’t sleep right now, I decided to download Bellhop, and play around with it.  It’s been a long time since I’ve used Bellhop, so I am sort of re-learning the input file format.

I’m working with the Matlab wrapper for Bellhop.  It’s not as fast as running the Fortran code directly, but it’s easier to configure.  Just for fun, I grabbed the sample Munk sound speed profile and chopped it off at a depth of 2000 meters.  I set up the .env file to compute the transmission loss at ranges up to 20 km.
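For anyone curious, the canonical Munk profile is just a simple analytic formula (this is the textbook version with the usual constants – Bellhop’s sample file may use slightly different values).  A quick Python sketch:

```python
import numpy as np

def munk_profile(z, z_axis=1300.0, eps=0.00737, c0=1500.0):
    """Canonical Munk sound speed profile: c(z) in m/s for depth z in meters."""
    # Scaled depth relative to the sound channel axis
    zt = 2.0 * (z - z_axis) / z_axis
    return c0 * (1.0 + eps * (zt - 1.0 + np.exp(-zt)))

# Profile chopped off at 2000 m, like my .env file
depths = np.linspace(0.0, 2000.0, 101)
speeds = munk_profile(depths)
```

The sound speed minimum sits at the channel axis depth (1300 m here), where c comes out to exactly 1500 m/s.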

Here are those results, for 20 Hz.

I picked this range, depth, and frequency to get an idea of the losses that might happen to a fin whale call near our ocean bottom seismometer network.  Of course, it’s completely inaccurate – I don’t have the right sound speed profile or bathymetry.  Also, I think this model approximates the transmission losses for an infinite CW signal, which is not a very realistic approximation of a fin whale call.
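As a sanity check on the numbers, simple spherical spreading alone – no absorption and no waveguide effects, just the textbook 20·log10(r) rule, which is my own back-of-envelope comparison rather than what Bellhop computes – gives:

```python
import math

def spherical_spreading_tl(r_m):
    """Transmission loss in dB re 1 m, assuming spherical spreading only."""
    return 20.0 * math.log10(r_m)

# At the 20 km edge of the run:
tl_20km = spherical_spreading_tl(20000.0)  # ~86 dB
```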

Next up:  a better sound speed profile and maybe some bathymetry information.

LaTeX and the JASA template

Today Dax and I are figuring out how to use LaTeX to put together a JASA paper.  Yay!  So fun – I love figuring out LaTeX stuff.  First off, we had to get Dax’s computer all set up to run LaTeX.  We went with the MacTeX-2010 distribution – it’s pretty much got all you need, and you don’t have to worry about getting the right dependencies or any of that stuff.  It installs a whole pile of tools and front-ends.

We haven’t had a chance to investigate much – so far we’re just running TeXShop.  It’s pretty straightforward, and does syntax highlighting automatically.  BibDesk is fine for the references too, but I prefer JabRef.  And as it turns out, JabRef was the easiest way to get Dax’s EndNote database converted to BibTeX .bib format.

To do the EndNote -> BibTeX conversion, we had to follow these steps (after http://wiki.lyx.org/BibTeX/Programs):

  1. In EndNote:  Edit -> Output Styles -> Open Style Manager
  2. Check the box marked “Refer Export”
  3. Go to File -> Export.  Save file as type “Text only”, Output style “Refer Export”.
  4. In JabRef:  File -> Import into new database
  5. Choose file format “Refer/Endnote”, and select the exported EndNote .txt file.
  6. In the intermediate viewer window, you can optionally skip duplicates or select which entries to import.  Click OK, and you’re done!
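For anyone who hasn’t seen the format, the converted entries end up as plain-text BibTeX records, something like this (the entry below is a made-up example for illustration, not one of Dax’s actual references):

```bibtex
@article{author2010example,
  author  = {Author, Ann and Coauthor, Bob},
  title   = {An example article title},
  journal = {J. Acoust. Soc. Am.},
  volume  = {127},
  pages   = {1--10},
  year    = {2010}
}
```

JabRef reads and writes this format directly, and the citation key (author2010example here) is what you reference with \cite{} in the LaTeX document.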

Easy Peasy!

Now for the JASA format.  I found a zip file on the JASA website under For Authors -> JASA.  The second-to-last option in the first section is Download files for preparation of JASA manuscripts in TeX format. This zip file contains a folder with all sorts of JASA/LaTeX goodness. I didn’t have a chance to dig in too deep, but I went through and grabbed the bits that seemed relevant, at least to have a first go at it. This meant the template file, the jasatex.cls file, and the jasanum.bst file.  There are instructions for installing “JasaTex”, but from what I can tell, it’s just instructions on getting the style files into the right folders in the TeX system sub-directories.

This was enough to get started, and Dax is filling in his LaTeX document with an early draft of his paper – and it looks great! The JASA LaTeX package also includes a guide that covers some LaTeX basics and general guidelines.  Very handy.

Learning about audio recording

I realize this is a very strange thing to become interested in, but I really want to learn about audio recording. Why? Well, what sparked it was listening to lots of NPR podcasts, and eventually stumbling across one done by someone who had no experience in recording. He mentioned that he had gotten started with a website called transom.org. I checked it out, and I’ve been somewhat obsessed with the whole concept since then… not because I want to actually put anything on the radio. It’s more personal than that – I’ve been wanting to write my dad’s life story for ages, but have been completely daunted by it. This gives me a chance to tackle it using a different medium.

My dad has had an interesting life – beginning in a tiny farming village in upper Austria, going through World War II as a child, moving to America, traveling around the world, finally settling down and starting a business, and starting a family later in life. He’ll be 80 this April, and since that’s a significant birthday, I thought it was a great time to get it all down.  I started with an hour-and-a-half-long interview this Saturday, spent several hours on Saturday and Sunday figuring out the software, cutting, editing, re-arranging… and have about 15 minutes of audio story at this point.  So far we’ve just covered his childhood, and I’m still not quite done putting it together.  I’m guessing I’ll have another 10-15 minutes by the time this section is done.

Thanks to Transom.org, I found some really simple audio editing software called Hindenburg Journalist.  It was really easy to learn, and not too terribly expensive.  I figured that the $65 was worth it… certainly better than the $250-350 I’d be looking at for Pro Tools.

As part of the research involved in learning all of this, I stumbled across a few sites that I thought were helpful:

50 Questions for Family History Interviews

How to Interview a Relative

Audio Interview Tips

This site is for teens (I wonder what would have happened if I’d stumbled across it as a teen)… but since I know basically nothing about audio recording and documentaries, the basics are just right:  Radio Diaries

I should also mention that in my scouring for resources, I came across Audacity: open-source (free!), cross-platform sound editing and recording software.  I may have to give it a go at some point, since I am a fan of all things open source.

Collaborative Ocean Visualization Environment (COVE)

I love finding exciting new software! And I have a very special place in my heart for open-source software. Alright, this one isn’t open source quite yet – it’s actually someone’s PhD project. (I don’t actually know his name, or I’d give him credit here – I’ll try to track it down.) So what is it? The quickest description I heard was from Mark Stoermer, who described it as “Google Earth for the oceans”. That description really doesn’t do it justice – it is much more than that. From what I can tell (not having tried it out yet), it has some overlap with other software suites, like Fledermaus, ArcMap, and especially EonFusion. I think that for my purposes, I could most likely use one of the commercially available options with no loss in the quality of the final product, but the BIG difference? The price. Don’t let that fool you into thinking it’s cheap-o, low-quality software, though. I saw some demos, and it looks every bit as professional as any viz software I’ve used.

Here’s a little taste, from the COVE website:

Dax is soon going to be working with Mark Stoermer to get the whale tracks into this software.  I can’t wait to see how that looks!