Tuesday, August 04, 2015

Hacking Zebrafish thoughts

The last lab from Scalable Machine Learning with Spark features a guest lecture by Jeremy Freeman, a professor of neuroscience at Janelia Farm Research Campus.

His group produced this gorgeous video of a living zebrafish brain. Little fish thoughts sparkle away, made visible by a technique called light-sheet fluorescence microscopy, in which the fish are engineered to express proteins that light up when their neurons fire.

The lab covers principal component analysis in a lively way. Principal components are extracted from time-series data, mapped onto an HSV color wheel, and used to color an image of the zebrafish brain. In the process, we use some fun matrix manipulation to aggregate the time-series data in two different ways - by time relative to the start of a visual stimulus and by the directionality of the stimulus (shown below).
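The PC-to-color-wheel trick can be sketched in a few lines of NumPy. This is my own toy reconstruction, not the lab's code: the synthetic sine-wave "neurons" stand in for real fluorescence traces, and I use SVD for the PCA step.

```python
import colorsys
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the lab's data: rows are neurons (pixels), columns are
# time points. Real data would be fluorescence traces from the zebrafish.
n_neurons, n_times = 500, 120
t = np.linspace(0, 4 * np.pi, n_times)
phases = rng.uniform(0, 2 * np.pi, n_neurons)
data = np.sin(t[None, :] + phases[:, None]) \
       + 0.1 * rng.standard_normal((n_neurons, n_times))

# PCA via SVD on the mean-centered matrix.
centered = data - data.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U[:, :2] * s[:2]   # each neuron's coordinates on the top 2 PCs

# Map the 2-D scores onto an HSV color wheel: angle -> hue, radius -> saturation.
angle = np.arctan2(scores[:, 1], scores[:, 0])
hue = (angle + np.pi) / (2 * np.pi)
radius = np.hypot(scores[:, 0], scores[:, 1])
sat = radius / radius.max()
colors = np.array([colorsys.hsv_to_rgb(h, s_, 1.0) for h, s_ in zip(hue, sat)])

print(colors.shape)  # one RGB triple per neuron → (500, 3)
```

Neurons whose responses line up with the same principal component end up the same color, which is what makes those brain images pop.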

The whole series of labs from the Spark classes was nicely done, but this was an especially fun way to finish it out.

Check out the Freeman Lab's papers:

Tuesday, July 21, 2015

Machine learning on music data

The 3rd lab from Scalable Machine Learning with Spark has you predict the year a song was published based on features from the Million Song Dataset. How much farther could you take machine analysis of music? Music has so much structure that's so apparent to our ears. Wouldn't it be cool to be able to parse out that structure algorithmically? Turns out, you can.
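The lab's prediction task boils down to regressing a year onto a feature vector. Here's a minimal sketch of that shape using ordinary least squares on synthetic data - the real lab uses timbre features from the Million Song Dataset and Spark's distributed training, neither of which I'm reproducing here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for audio features (the real lab derives these
# from the Million Song Dataset's timbre data).
n_songs, n_features = 1000, 12
X = rng.standard_normal((n_songs, n_features))
true_w = rng.standard_normal(n_features)
year = 1990 + X @ true_w + rng.standard_normal(n_songs)  # noisy linear target

# Fit ordinary least squares with an intercept column.
A = np.hstack([np.ones((n_songs, 1)), X])
coef, *_ = np.linalg.lstsq(A, year, rcond=None)
pred = A @ coef
rmse = np.sqrt(np.mean((pred - year) ** 2))
print(round(rmse, 2))
```

With noise of standard deviation 1 baked in, the RMSE lands near 1 - about the best any model could do on this synthetic setup.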

Apparently The International Society for Music Information Retrieval (ISMIR) is the place to go for this sort of thing. A few papers, based on minutes of rigorous research (aka random googling):

In addition to inferring a song's internal structure, you might want to relate its acoustic features to styles, moods or time periods (as we did in the lab). For that, you'll want music metadata from sources like:

There's a paper on The Million Song Dataset by two researchers at Columbia's EE department and two more at the Echo Nest.

Even Google is interested in the topic: Sharing Learned Latent Representations For Music Audio Classification And Similarity.

Tangentially related, a group out of Cambridge and Stanford say Musical Preferences are Linked to Cognitive Styles. I fear what my musical tastes would reveal about my warped cognitive style.

Wednesday, July 08, 2015

Scalable Machine Learning with Spark class on edX

Introduction to Big Data with Apache Spark is an online class hosted on edX that just finished. Its follow-up Scalable Machine Learning with Spark just got started.

If you want to learn Spark - and who doesn't? - sign up.

Spark is a successor to Hadoop that comes out of the AMPLab at Berkeley. It's faster for many operations because it keeps data in memory, and its programming model feels more flexible in comparison to Hadoop's rigid framework. The AMPLab provides a suite of related tools including support for machine learning, graphs, SQL and streaming. While Hadoop is most at home with batch processing, Spark is a little better suited to interactive work.

The first class was quick and easy, covering Spark and RDDs through PySpark. No brain stretching on the order of Daphne Koller's Probabilistic Graphical Models to be found here. The lectures stuck to the "applied" aspects, but that's OK. You can always hit the papers to go deeper. The labs were fun and effective at getting you up to speed:

Labs for the first class:

  • Word count, the hello world of map-reduce
  • Analysis of web server log files
  • Entity resolution using a bag-of-words approach
  • Collaborative filtering on a movie ratings database. Apparently, I should watch these: Seven Samurai, Annie Hall, Akira, Stop Making Sense, Chungking Express.
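The word-count lab's map-reduce shape is easy to sketch on a single machine. This is plain Python standing in for PySpark - the real lab would do something like `sc.textFile(...).flatMap(split).map(lambda w: (w, 1)).reduceByKey(add)` on an RDD.

```python
from itertools import groupby
from operator import itemgetter

lines = ["to be or not to be", "that is the question"]

# "map" phase: emit a (word, 1) pair for every word in every line.
pairs = [(word, 1) for line in lines for word in line.split()]

# "shuffle": group pairs by key. groupby needs sorted input, which is
# roughly what Spark's shuffle accomplishes across a cluster.
pairs.sort()

# "reduce" phase: sum the counts within each group.
counts = {word: sum(n for _, n in group)
          for word, group in groupby(pairs, key=itemgetter(0))}

print(counts["to"], counts["be"])  # → 2 2
```

The whole point of Spark, of course, is that the same three-step shape scales from this toy to terabytes.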

The second installment looks to be very cool, delving deeper into MLlib, the AMPLab's machine learning library for Spark. Its labs cover:

  • Musicology: predict the release year of a song given a set of audio features
  • Prediction of click-through rates
  • Neuroimaging: analysis of brain activity in zebrafish (which I suspect is the phrase "Just keep swimming" over and over), done in collaboration with Jeremy Freeman of the Janelia Research Campus.

The labs for both classes are authored as IPython notebooks in the amazingly cool Jupyter framework, where prose, graphics and executable code combine to make a really nice learning environment.

Echoing my own digital hoarder tendencies, the first course is liberally peppered with links, which I've dutifully culled and categorized for your clicking compulsion:

Big Data Hype


Data Cleaning



The Data Science Process

In case you're still wondering what data scientists actually do, here it is according to...

Jim Gray

  • Capture
  • Curate
  • Communicate

Ben Fry

  • Acquire
  • Parse
  • Filter
  • Mine
  • Represent
  • Refine
  • Interact

Jeff Hammerbacher

  • Identify problem
  • Instrument data sources
  • Collect data
  • Prepare data (integrate, transform, clean, filter, aggregate)
  • Build model
  • Evaluate model
  • Communicate results

...and don't forget: Jeffrey Leek and Hadley Wickham.

Tuesday, June 02, 2015

Beyond PEP 8 -- Best practices for beautiful intelligible code

I didn't really mean to become a Python programmer. I was on my way to something with a little more rocket-science feel. R, Scala, Haskell, maybe. But, since I'm here, I may as well learn something about how to do it right. In this respect, I've become a fan of Raymond Hettinger.

Python coders will enjoy and benefit from Raymond's excellent talk given at PyCon 2015 about Python style, Beyond PEP 8 -- Best practices for beautiful intelligible code.

"Who should PEP-8-ify code? The author. PEP 8 unto thyself not unto others."

To Hettinger, PEP-8 is not a weapon for bludgeoning rival developers into submission. Going beyond PEP 8 is about paying attention to the stuff that really matters - using language features like magic methods, properties, iterators and context managers. Business logic should be clear and float to the top. In short, writing beautiful idiomatic Pythonic code.
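Here's a small sketch of my own (not from the talk) touching the features Hettinger highlights: dunder methods that plug into the language's protocols, a property instead of a getter, and a context manager for paired setup/teardown.

```python
from contextlib import contextmanager

class Playlist:
    def __init__(self, songs=()):
        self._songs = list(songs)

    # Magic methods let the object work with len(), iteration, and "in".
    def __len__(self):
        return len(self._songs)

    def __iter__(self):
        return iter(self._songs)

    def __contains__(self, song):
        return song in self._songs

    # A property exposes derived state without a clunky get_ method.
    @property
    def is_empty(self):
        return not self._songs

@contextmanager
def temporarily_added(playlist, song):
    """Add a song for the duration of a with-block, then remove it."""
    playlist._songs.append(song)
    try:
        yield playlist
    finally:
        playlist._songs.remove(song)

p = Playlist(["Annie Hall theme"])
with temporarily_added(p, "Stop Making Sense"):
    assert len(p) == 2 and "Stop Making Sense" in p
assert len(p) == 1 and not p.is_empty
```

Notice how the business logic at the bottom reads almost like English - that's the "float to the top" idea in action.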

There are plenty more videos from PyCon 2015 where that one came from.

Monday, March 09, 2015

Extended Lake Union Loop

The standard running loop around Lake Union is a touch over 6 miles. With the addition of a side loop around Portage Bay, you can bring it up to 8 and a half, taking in a bit of UW's campus and crossing over the cut into Montlake. Sticking to the water's edge keeps the terrain nice and flat, but if you want some climbing, head up into Capitol Hill via Interlaken park.

Here, I've factored in a stop at PCC for a cold drink.

Tuesday, January 27, 2015

Haskell class wrap-up

[From the old-posts-that-I've-sat-on-for-entirely-too-long-for-no-apparent-reason department...]

Back in December, I finished FP101x, Introduction to Functional Programming. I'm stoked that I finally learned me a (little) Haskell, after wanting to get around to it for so long.

The first part of the course was very straightforward, covering the basics of programming in the functional style. But the difficulty ramped up quickly.

A couple of labs were particularly mind-bending, and not just for me, judging by the message boards. Both were based on Functional Pearl papers and featured monads prominently. The first was on monadic parser combinators and the second was based on A Poor Man's Concurrency Monad. Combining concurrency (of a simple kind), monads and continuation passing is a lot to throw at people at once.
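The parser-combinator idea translates out of Haskell surprisingly well. As a sketch (in Python, since that's this blog's lingua franca - the lab itself is pure Haskell): a parser is a function from an input string to a list of (result, remaining-input) pairs, and "bind" sequences two parsers.

```python
def item(s):
    """Consume one character; fail (empty list) on empty input."""
    return [(s[0], s[1:])] if s else []

def unit(value):
    """Succeed without consuming input (Haskell's return/pure)."""
    return lambda s: [(value, s)]

def bind(parser, f):
    """Run parser, then feed each result to f to get the next parser."""
    return lambda s: [pair for value, rest in parser(s)
                      for pair in f(value)(rest)]

def char(c):
    """Consume one character only if it equals c."""
    return bind(item, lambda x: unit(x) if x == c else (lambda s: []))

# Sequence two parsers: parse "a" then "b", keep both characters.
ab = bind(char("a"), lambda a:
     bind(char("b"), lambda b:
     unit(a + b)))

print(ab("abc"))   # → [('ab', 'c')]
print(ab("xbc"))   # → []
```

Haskell's do-notation hides all that nesting of lambdas, which is a big part of why the monadic version reads so cleanly - and why it's so disorienting the first time you peek under the hood.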

The abrupt shift to more challenging material is part of a philosophy of "teaching the students to fish for themselves". So is introducing new material in the labs rather than in the lectures. This style of teaching alienated a number of students. It's not my favorite, but I can roll with it.

Just be aware that the course requires some self-directed additional reading, and don't flail around trying to solve the homeworks without sufficient information.

More Haskell

Now that the class is over, I'd like to find time to continue learning Haskell:

One reason I wanted to learn Haskell is to be able to read some of the Haskell-ish parts of the programming languages literature:

Monday, January 12, 2015

Brave Genius

Brave Genius is an unlikely dual biography of a biologist and a writer who shared a friendship and a common philosophy. Both were active in the French resistance to the German Occupation and both would later receive a Nobel prize. Sean B. Carroll forges an inspiring story from seemingly incongruous elements: the desperate defiance of a few in an occupied country, the exhilarating pursuit of an open scientific question, and a lonely stand on the moral high ground.

In 1940, Jacques Monod was a newly married father of twins and a researcher at the Sorbonne. Albert Camus, having already published a couple of books of essays, departed his native Algeria for France in March of that year to find work.

On May 10, 1940, German troops crossed into Holland and Belgium. Panzers raced towards the Atlantic coast, severing Allied lines and stranding French and British troops in the Low Countries. French defenses collapsed and Germans arrived in an undefended Paris on June 14. The armistice signed on June 22nd marked the beginning of four years of occupation.

During those years, Camus edited and wrote for the underground newspaper Combat urging resistance to the occupation. As the tide of the war turned, Monod organized sabotage attacks and armed resistance ahead of the approaching liberators.

“I have always believed that if people who placed their hopes in the human condition were mad, those who despaired of events were cowards. Henceforth, there will be only one honorable choice: to wager everything on the belief that in the end words will prove stronger than bullets.” Camus, Combat (November 30, 1946)

François Jacob, André Lwoff and Jacques Monod were awarded a Nobel prize in 1965 for their work on the control of gene expression, elucidating the regulation of the lac operon by which bacteria switch on metabolism of the sugar lactose.

In his writing, Camus confronts the absurdity of the human search for clarity and meaning in a world that offers only indifference. The attempt to derive meaning and morality without resort to mysticism links Camus's philosophy to Monod's scientific work, which provided some of the first direct evidence that life is mechanistic rather than the result of some magical "vital force" and that its workings could be understood.

“The scientific approach reveals to Man that he is an accident, almost a stranger in the universe.” Monod, in On Values in the Age of Science (1969)

“One of the great problems of philosophy, is the relationship between the realm of knowledge and the realm of values. Knowledge is what is; values are what ought to be. I would say that all traditional philosophies up to and including Marxism have tried to derive the 'ought' from the 'is.' My point of view is that this is impossible.” Monod

Carroll, a biologist himself, embeds philosophy and science into the personal lives of his protagonists and the geopolitical events unfolding around them. Both men did brilliant work in the darkest of times, and did so not by retreating but by fully engaging at great risk with the struggles that faced them. The book serves as a warning of what happens when good people overlook the malfeasance of their leaders, but also as confirmation of the resilience of intellect, creativity and humanity.