Looking for a job
The time has finally come.
I’m looking for a job!
After ±5 years of hard work at Hasselt University, I will graduate next month with a master’s degree in computer science. I finished my master’s thesis and courses in June 2011 and completed my internship at Facebook a few weeks ago (on December 16). I’ve received an awesome job offer to work full-time at Facebook.
But my super-duper awesome girlfriend, Anneleen, is studying medicine here in Belgium. If she were to continue studying medicine in the U.S., she’d have to start all over, so that’s not really an option (not to mention the ridiculous costs). This summer, we’ll move in together in a (yet to be found) apartment in Leuven, Belgium.
Also, I just like Europe better than the United States.
I’ve already talked to several companies, both months ago and more recently, but since there are so many interesting companies, projects and challenges out there, I decided to write this blog post.
My main interests (and areas of expertise) are:
- WPO (Web Performance Optimization): making websites faster
- Drupal
- data mining
Want to talk to me? Contact me at http://wimleers.com/contact.
Want to know more? The high-level background story is below. See my resume (also available as a PDF) and my LinkedIn profile for additional information.
Background story
Since September 2006, I’ve been studying Computer Science.
The project I started in high school — driverpacks.net — was open sourced at the end of 2006 and is still thriving today (with almost 3,000,000 downloads in 2011).
That project is how I got involved with Drupal, about which I’ve written many times now, and to which I’ve made numerous contributions. In fact, my five-year Drupal anniversary was just 6 days ago!
My most popular module is Hierarchical Select, which is used by 14,400 Drupal sites today. I made 696 commits to that module, out of a total of 1,013 commits to Drupal contrib.
I’ve spoken at several DrupalCons and have worked directly for Dries at Mollom.
In December 2007, I first encountered Steve Souders’ 14 rules for speeding up websites. Being frustrated by slow websites, I was very interested in this. So I spent the Christmas vacation looking into how Drupal was doing on those 14 aspects. That’s how the (apparently still kind-of-used-as-a-reference) article “Improving Drupal’s page loading performance” came into existence¹. Simultaneously, I developed an initial version of the CDN module.
Building upon this knowledge, I wrote a proposal to do my bachelor’s thesis on improving Drupal’s page loading performance. It was accepted. The result was File Conveyor, a daemon written in Python that discovers files (using inotify and FSEvents), processes them (optimizing JS/CSS, lossless image optimization …) and syncs them (to Amazon S3/CloudFront, (S)FTP servers, Origin-Pull CDNs …).
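For the curious, here is a minimal sketch of that discover/process/sync pipeline in Python. This is not File Conveyor’s actual code: it uses the cross-platform watchdog library (which wraps inotify on Linux and FSEvents on OS X) rather than talking to those APIs directly, and WATCH_DIR, SYNC_DIR, process_file and sync_file are hypothetical placeholders.

```python
# Minimal sketch of a discover -> process -> sync pipeline, in the spirit of
# File Conveyor. NOT its actual implementation: it relies on the watchdog
# library (inotify on Linux, FSEvents on OS X) and placeholder helpers.
import queue
import shutil
import threading
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCH_DIR = "/var/www/files"   # hypothetical: directory to watch
SYNC_DIR = "/mnt/cdn-origin"   # hypothetical: stand-in for S3/FTP/CDN


class NewFileHandler(FileSystemEventHandler):
    """Discover: push newly created files onto a work queue."""

    def __init__(self, work_queue):
        self.work_queue = work_queue

    def on_created(self, event):
        if not event.is_directory:
            self.work_queue.put(event.src_path)


def process_file(path):
    """Process: placeholder for JS/CSS optimization, image optimization, etc."""
    return path  # a real daemon would write an optimized copy and return it


def sync_file(path):
    """Sync: placeholder for uploading to S3/CloudFront, (S)FTP, a CDN, etc."""
    shutil.copy(path, SYNC_DIR)


def worker(work_queue):
    while True:
        path = work_queue.get()
        sync_file(process_file(path))
        work_queue.task_done()


if __name__ == "__main__":
    work_queue = queue.Queue()
    threading.Thread(target=worker, args=(work_queue,), daemon=True).start()

    observer = Observer()
    observer.schedule(NewFileHandler(work_queue), WATCH_DIR, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```

A real daemon would of course persist its queue, retry failed uploads and support multiple destinations; the point here is just the shape of the pipeline.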
Besides File Conveyor, I also wrote a new version of the CDN module for Drupal 6², as well as an Episodes module, which measures real-world page loading performance, along with another Drupal module to analyze the collected measurements. That was a full two years before New Relic added “Real User Monitoring” using the same Episodes library!
The Drupal module that analyzed the collected measurements was just a proof of concept: it didn’t scale at all. In my first year as a master’s student, I took a data mining course. The possibilities of data mining intrigued me, and I connected the dots: from Drupal to WPO to data mining.
That’s why I proposed to do my master’s thesis on Web Performance Optimization Analytics. The basic premise: build the Google Analytics for WPO, using frequent pattern mining to automatically find which parts of web pages are slow in which contexts (browser, connection type, location …).
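To give a feel for the idea (a toy sketch, not the thesis implementation): treat every page-load sample as a set of (attribute, value) items plus a load time, and report the combinations of items that show up in a large fraction of the slow samples. The thresholds and sample data below are made up.

```python
# Toy illustration of frequent pattern mining over page-load samples: find
# attribute combinations (browser, connection, location, ...) that frequently
# co-occur in slow page loads. Not the thesis code; all data is invented.
from collections import Counter
from itertools import combinations

SLOW_THRESHOLD_MS = 3000   # assumption: what counts as a "slow" page load
MIN_SUPPORT = 0.4          # assumption: a pattern must cover 40% of slow samples

# Hypothetical samples: context attributes plus a measured page load time.
samples = [
    {"browser": "IE8", "connection": "3G", "location": "BE", "load_ms": 4200},
    {"browser": "IE8", "connection": "3G", "location": "US", "load_ms": 3900},
    {"browser": "Chrome", "connection": "cable", "location": "BE", "load_ms": 900},
    {"browser": "IE8", "connection": "3G", "location": "BE", "load_ms": 5100},
    {"browser": "Firefox", "connection": "cable", "location": "US", "load_ms": 1100},
]

slow = [s for s in samples if s["load_ms"] >= SLOW_THRESHOLD_MS]

# Count every combination of (attribute, value) pairs within each slow sample.
pattern_counts = Counter()
for sample in slow:
    items = sorted((k, v) for k, v in sample.items() if k != "load_ms")
    for size in range(1, len(items) + 1):
        for pattern in combinations(items, size):
            pattern_counts[pattern] += 1

# Report the patterns that cover at least MIN_SUPPORT of the slow samples.
for pattern, count in pattern_counts.most_common():
    if count / len(slow) >= MIN_SUPPORT:
        print(f"{count}/{len(slow)} slow page loads match {dict(pattern)}")
```

Real frequent pattern mining algorithms (Apriori, FP-Growth …) avoid enumerating every combination like this brute-force sketch does, which is what makes the approach feasible at the data volumes mentioned further down.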
This proposal was also accepted. While I was still working on my master’s thesis, Facebook contacted me.
I passed Facebook’s technical interviews and then flew out to Silicon Valley in September 2011. There, I was part of the Site Speed team. I worked on “regular Site Speed team stuff”, but most of my time was devoted to my intern project.
My intern project was basically about making my master’s thesis useful for Facebook: making it more generic and integrating it with existing internal Facebook tools to detect patterns in performance data.
It is currently running in production. It’s used by the Site Speed team for detecting performance problems and will be used by two other teams.
There are currently five pattern mining jobs mining data streams. The biggest job analyzes 17 million samples per day, but splits each sample into 5 separate ones, so that’s about 85 million per day, or almost 1,000 per second. Per sample, 10 to 11 attributes are analyzed, so that’s about 900 million attributes analyzed per day. And that’s just one of the five jobs.
Now, in January 2012, I’m looking for a job!
That’s my story up to this point. As you can tell, I try to chain my areas of interest together, which allows for a higher impact. I don’t quite understand how I did it myself, but I somehow managed to tie all of it together.
I hope the next chapters of my life will be even more fun! :)
¹ The term “WPO” did not exist yet!
² This module shipped with the same (but backported, of course) Drupal core patch that I had contributed to Drupal 7 core for my bachelor’s thesis.