

Releasing node-mysql 2.0.0-alpha

Tue, 15 May 2012 09:04:24 +0000

Today I am releasing an alpha version of node-mysql v2.0. If you are using v0.9.x at this point, I highly encourage you to try it out, as now is your best chance to influence the API and features of the final release.

To install the new version, do:

npm install mysql@2.0.0-alpha

Then check out the Upgrading Guide and adjust your code as needed.

After that make sure to join the new mailing list or #node-mysql IRC channel to provide any feedback you may have.

This new version comes with a few exciting improvements:

  • ~5x faster than v0.9.x for parsing query results
  • Support for pause() / resume() (for streaming rows)
  • Support for multiple statement queries
  • Support for stored procedures
  • Support for transactions
  • Support for binary columns (as blobs)
  • Consistent & well-documented error handling
  • A new Connection class with well-defined semantics (unlike the old Client class)
  • Convenient escaping of objects / arrays that allows for simpler query construction
  • A significantly simpler code base
  • Many bug fixes & other small improvements (Closed 62 out of 66 GitHub issues)
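To make the "convenient escaping of objects / arrays" bullet above concrete, here is a toy illustration of the idea: expanding `?` placeholders and escaping values, including arrays, into a SQL string. This is an assumption-laden sketch of the concept, not node-mysql's actual implementation, and the naive string escaping is for illustration only.

```javascript
// Toy sketch of placeholder escaping -- NOT node-mysql's real code.
function escapeValue(val) {
  if (val === null) return 'NULL';
  if (typeof val === 'number') return String(val);
  // Arrays expand to comma-separated escaped values, e.g. for IN (?) clauses.
  if (Array.isArray(val)) return val.map(escapeValue).join(', ');
  // Naive quoting for illustration only -- never use this in production.
  return "'" + String(val).replace(/'/g, "\\'") + "'";
}

function format(sql, values) {
  var i = 0;
  // Replace each ? with the next escaped value.
  return sql.replace(/\?/g, function() {
    return escapeValue(values[i++]);
  });
}

console.log(format('SELECT * FROM users WHERE id IN (?) AND name = ?',
                   [[1, 2, 3], "O'Brien"]));
// SELECT * FROM users WHERE id IN (1, 2, 3) AND name = 'O\'Brien'
```

The appeal of this style is that query construction stays a plain string with `?` markers, while the library handles quoting consistently.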

I have been working on this new version for quite some time, but was only now able to complete the final work, thanks to my amazing new sponsors:

For those of you interested in the future of v0.9.x:

  • There will be no v1.0 release; the way 0.9.x handles its queue and reconnect mechanism is broken and not easily fixable without breaking BC.
  • I will merge critical bug fixes that come with tests and don't break BC.
  • There will be no major changes, especially features.

Going forward, I also hope to find some time to write about:

  • The new parser and what makes v2 so fast
  • The choice of rewriting vs. refactoring (I tried both)
  • My failed first attempt at a new parser design




How to write jQuery plugins

Thu, 29 Mar 2012 06:41:34 +0000

jQuery, the most popular JavaScript library out there, is great for DOM abstraction. It allows you to encapsulate functionality into your own plugins, which is a great way to write reusable code. However, jQuery's rules for writing plugins are very loose, which leads to different plugin development practices - some of which are pretty poor. With this article I want to provide a simple plugin development pattern that will work in many situations.

If the functionality you would like to encapsulate is large and really complex, jQuery plugins are probably not what you should use in the first place. You'd be better off with something like BackboneJS or jQuery.Controller in this case. If you can't or don't want to use Backbone, you might still get away with my solution.

Starting off

;(function($, doc, win) {
  "use strict";

  // plugin code will come here

})(jQuery, document, window);

The semicolon before the function invocation keeps the plugin from breaking if it is concatenated with other scripts that are not closed properly. "use strict"; puts our code into strict mode, which catches some common coding problems by throwing exceptions, prevents or throws errors when relatively "unsafe" actions are taken, and disables JavaScript features that are confusing or poorly thought out. To read about this in detail, please check ECMAScript 5 Strict Mode, JSON, and More by John Resig.

Wrapping the jQuery object into the dollar sign via a closure avoids conflicts with other libraries that also use the dollar sign as an abbreviation. window and document are passed through as local variables rather than as globals, because this speeds up the resolution process and can be more efficiently minified.
Invoking our plugin

;(function($, doc, win) {
  "use strict";

  function Widget(el, opts) {
    this.$el  = $(el);
    this.opts = opts;
    this.init();
  }

  Widget.prototype.init = function() {
  };

  $.fn.widget = function(opts) {
    return this.each(function() {
      new Widget(this, opts);
    });
  };
})(jQuery, document, window);

$('#mywidget').widget({optionA: 'a', optionB: 'b'});

We invoke our plugin on a jQuery object or jQuery set by simply calling our widget() method on it and passing it some options. Never forget the "return this.each(function() { ... })", in order not to break the chainability of jQuery objects. The main functionality of the plugin is encapsulated into a separate Widget class, which we instantiate for each member of our jQuery set. Now all functionality is encapsulated in these wrapper objects.

The constructor is designed to just keep track of the passed options and the DOM element that the widget was initialized on. You could also keep track of more sub-elements here to avoid having to .find() them every time you need them (think of performance):

;(function($, doc, win) {
  "use strict";

  function Widget(el, opts) {
    this.$el     = $(el);
    this.opts    = opts;
    this.$header = this.$el.find('.header');
    this.$body   = this.$el.find('.body');
    this.init();
  }

  // ...
})(jQuery, document, window);

Parsing options

When we invoked the plugin we passed it some options. Often you need default options that you want to extend. This is how we bring the two together in our object's init() method:

;(function($, doc, win) {
  "use strict";

  function Widget(el, opts) {
    this.$el  = $(el);
    this.defaults = {
      optionA: 'so[...]
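The "Parsing options" example above is truncated in this feed, but it presumably finishes by merging the user's opts over the defaults with $.extend. As an illustration of what that merge does, here is a simplified stand-in written in plain JavaScript (this extend function is an assumption for demonstration, not jQuery's implementation):

```javascript
// Simplified stand-in for $.extend({}, defaults, opts) on flat objects:
// copies own properties of each source onto target, later sources winning.
function extend(target) {
  for (var i = 1; i < arguments.length; i++) {
    var source = arguments[i];
    for (var key in source) {
      if (Object.prototype.hasOwnProperty.call(source, key)) {
        target[key] = source[key];
      }
    }
  }
  return target;
}

var defaults = {optionA: 'defaultA', optionB: 'defaultB'};
var opts     = {optionB: 'custom'};
var merged   = extend({}, defaults, opts);
console.log(merged); // { optionA: 'defaultA', optionB: 'custom' }
```

Passing a fresh {} as the target is important: it keeps the defaults object itself from being mutated, so every widget instance starts from clean defaults.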

Vim Workshop in Berlin (April 20)

Mon, 26 Mar 2012 08:56:21 +0000

My friend Drew of Vimcast fame is organizing two half-day vim workshops in Berlin on April 20.

As a former Textmate user, I cannot overstate the productivity gains from mastering vim. With the early bird discount, the tickets sell at 75 GBP (~90 EUR), and there are only a few tickets left, so you should act quickly.

The workshops are aimed at intermediate users, so if your vim skills are non-existent or very rusty, you should probably play with vimtutor before showing up.

I'll be attending the afternoon workshop along with Tim, so hope to see you there!


Full Disclosure: I gain nothing by promoting this event other than the joy of seeing people boost their productivity.



NPM - An intervention

Wed, 22 Feb 2012 17:03:00 +0000

Update: Isaac commented and explained why fuzzy version specifiers are here to stay. I'll be ok with it and will adapt my workflow accordingly.
Update 2: I did not give up on the bug that is part of the story below; a test case and fix have been submitted and merged!
Update 3: NPM Shrinkwrap is now a real thing.

NPM is the official node package manager. Unlike many package managers that came before, it is actually incredibly awesome, and has helped to create one of the most vibrant communities in the history of open source. However, today I want to talk about a few aspects of npm that concern me. In particular I want to talk about stuff where I feel that NPM is making bad things easy, and good things hard.

NPM module versions are broken

Today, I tried to contribute to the forever module. The company I am helping had to patch their version of it because of a hard-to-reproduce bug in production, and asked me to help submit their fix upstream. Being the scientific type, I set out to write a test case against the forever version their patch is based on:

$ npm install forever@0.7.2

Fantastic, NPM lets me specify which version of forever I want to install. Now let's verify that the installed version works:

$ ./node_modules/forever/bin/forever
node:134
  throw e; // process.nextTick error, or 'error' event on first tick
        ^
TypeError: undefined is not a function
    at CALL_NON_FUNCTION_AS_CONSTRUCTOR (native)
    at Object. (/Users/Felix/Desktop/foo/node_modules/forever/lib/forever:43:23)
    ...

Oh no, what happened? Mind you, except for an unrelated patch, this version of forever is running perfectly fine in production. Well, as it turns out, you have been lied to. There is no such thing as forever v0.7.2. At least not a single one. It depends on an implicit and unchangeable second parameter: time. Why is that? Well, it is because forever v0.7.2 depends on this:

"nconf": "0.x.x",

And as it turns out, nconf has released newer versions matching this selector, featuring a different API.
You are doing it wrong

"Hah!", you might say. "That's why you should check your node_modules into git!" I am sorry, but that is not helpful. While this will allow me to pin down the node modules used by my app exactly, it does not help me here. What I want to do is reproduce this bug in a standalone copy of forever v0.7.2, then check if it exists in the latest version, and if so, submit the test case and fix for it upstream. However, I can't. Not without manually resolving all forever dependencies the way NPM resolved them when v0.7.2 was released. (The fact that forever is a bit of a spaceship when it comes to dependencies does not help either.)

Discouraging open source

Speaking of Mikeal's article: I felt that something was wrong about checking your node_modules into git when reading it, but it is only now that I can point out what. In the article, Mikeal argues that module authors should not try to exactly reference their dependency versions, so that users get more frequent updates of those dependencies and help test them. However, he says doing so for your app is a good thing. I disagree. To me, this approach discourages open source for two reasons:

a) Bug reports: I currently maintain 44 NPM modules. It is very hard to keep up with that. If you are asking me to support multiple versions of all my dependencies, I will have to stop helping people with bug reports for my modules. When somebody reports a bug for a given version of my module, I want to know exactly what version they used. Figuring out when they installed my module to rule out dependency issues for every bug report is not an option for me.

b) Contributions: Ask yourself what is easier. Adding a quick patch to a node module you already include in the git repo of your app, --or-- creating a fork of it, fixing the problem in the fork, pushing that fork to GitHub, changing your package.json to point to your fork, and [...]
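To make the failure mode above concrete: a fuzzy specifier resolves to whatever the newest matching release is at install time, while an exact version always names the same published package. The dependency names below are hypothetical placeholders chosen to show both styles side by side:

```json
{
  "dependencies": {
    "fuzzy-dep": "0.x.x",
    "exact-dep": "1.2.3"
  }
}
```

Installing "fuzzy-dep" today and in six months can yield different code; installing "exact-dep" cannot, which is what makes an old bug reproducible.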

Testing node modules with Travis CI

Fri, 18 Nov 2011 09:17:12 +0000

You have written a node module lately? It has a test suite? Awesome! Time to get yourself a nerd badge of honor:


But hang on, nerdy warrior, this precious award has to be earned. So go ahead and check out the sweetness that is Travis CI. Travis is an open source, free-to-use continuous integration server. Initially it was just building Ruby stuff, but these days it supports a ton of other languages, including node.

And luckily, getting Travis to run your tests on every GitHub push is really easy as well:

Step 1: Go to Travis and login/connect with your GitHub account.

Step 2: Hover over your name on the top right, and select "Profile" from the dropdown.

Step 3: You should see all your GitHub projects. Flip the "Off" switch to "On" for a node project you want to use with Travis.

Step 4: Add a .travis.yml file to your project with the following:

language: node_js
node_js:
  - 0.4
  - 0.6

Step 5: Make sure your package.json has something like this:

"scripts": {
    "test": "make test"

Step 6: Git push, and watch Travis build your project on the home screen!

Step 7: Assuming your tests are passing, it is time to get your badge of honor. Adding it to your GitHub README is as simple as:

[![Build Status](](
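The badge URLs above got stripped in this feed. As a sketch of the general shape (USER/REPO are hypothetical placeholders, and the exact badge URL Travis hands out for your project may differ):

```markdown
[![Build Status](https://secure.travis-ci.org/USER/REPO.png)](http://travis-ci.org/USER/REPO)
```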

If you want to see an example of what this looks like, and you also happen to be in the market for some no-bullshit testing tools, check out my new libs:

  • utest: The minimal unit testing library.
  • urun: The minimal test runner.

That's it. And in case you are not excited enough yet, go and check out the Travis Docs to discover additional goodies like how to work with databases, etc.




Private npm modules

Thu, 08 Sep 2011 14:18:26 +0000

Thanks to Isaac, npm is getting more and more awesome by the hour. One of the coolest recent additions (you need at least v1.0.26) is the ability to specify private git repository URLs as a dependency in your package.json files.

At transloadit, we are currently using the feature to move some of our infrastructure code into separate packages, allowing those to be tested and developed in isolation and making our core application easier to maintain and work on.

The syntax for referencing a git repository (and commit) is as follows:

  "name": "my-app",
  "dependencies": {
    "private-repo": "git+ssh://",

This will include a private npm module called "private-repo" from GitHub. The URL also contains an optional refspec (#v0.0.1) that tells npm which branch, commit, or (in this case) tag you want to have checked out.
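The repository URL itself got stripped in this feed. For reference, the general shape of such an entry, with hypothetical USER and repository placeholders, is:

```json
{
  "dependencies": {
    "private-repo": "git+ssh://git@github.com/USER/private-repo.git#v0.0.1"
  }
}
```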

Now of course this is not the only way to do private npm repositories, but it is much simpler than running your own registry, so I would recommend it to most people.

Before you head off to play with this, here is a final tip that may save you some headaches. In all your private npm modules, add "private": true to your package.json. This will make sure npm never lets you accidentally publish your secret sauce to the official npm registry.
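In package.json terms, that tip looks like this (the name and version below are placeholders):

```json
{
  "name": "private-repo",
  "version": "0.0.1",
  "private": true
}
```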

Happy hacking, --fg

PS: When deploying, don't forget that you need to authorize the server's SSH key for the GitHub repository you are depending on.



How to fork & patch npm modules

Tue, 26 Jul 2011 13:15:03 +0000

With now more than 3000 modules, there are huge gaps in the quality of things you find in the npm registry. But more often than not, it's easy to find a module that is really close to what you need - if only it weren't for that one bug or missing feature.

Now depending on who is maintaining this module, you may get the problem resolved by simply opening a GitHub issue and waiting for a few days. However, open source doesn't really work without community, nor do you always want to be at the mercy of someone else. So a much better approach is to actually roll up your sleeves, and fix the problem yourself.

Here is the proper way to do this while using npm to manage your forked version of the module:

  1. Fork the project on GitHub
  2. Clone the fork to your machine
  3. Fix the bug or add the feature you want
  4. Push your commits up to your fork on GitHub
  5. Open your fork on GitHub, and click on the latest commit you made
  6. On the page of that commit, click on the "Downloads" button
  7. Right click on the "Download .tar.gz" button inside the popup, and copy the link ("Copy Link Address" in Chrome)
  8. Open up your package.json file, and replace the version number of the module with the url you just copied
  9. Send a pull request upstream (Optional, but this way you will avoid having to maintain that patch of yours against newer versions of the module you forked)

Example: My new airbrake module uses a forked version of xmlbuilder. I submitted my fix as a pull request, but it has not been merged yet. In order to pull in my changes via npm anyway, I simply pointed my package.json to the download URL of my fork on GitHub like so:

"dependencies": {
    "request": "1.9.8",
    "xmlbuilder": "",
    "stack-trace": "0.0.5",
    "traverse": "0.4.4",
    "hashish": "0.0.4"

Alright, let me know if this is helping your node adventures, or if you have an alternative workflow you are using. Otherwise, happy hacking!


PS: You should upgrade to the latest npm version first; some older versions had problems handling URL dependencies properly.



Node Workshop in Cologne, June 10th

Fri, 20 May 2011 13:26:03 +0000

We apologize for the short notice, but if you are looking to put node in production, this full-day node workshop we are organizing is where it's at!

The workshop is happening on Friday, June 10. Space is limited to 15 people and is expected to sell out quickly.

As a reader of our blog, you can get a 15% discount on the regular ticket by using the code 'debuggable'.

Should you attend?

This workshop will teach you everything you need in order to write and deploy powerful node applications. We'll try to cover a lot of ground, so if you are interested in any of the following, you should definitely attend:

  • Setting up node on your local machine
  • Understanding the module system
  • Using npm for installing and upgrading modules
  • Publishing your own npm modules
  • Everything you need to know about http.Server
  • Structuring your code using OOP in JavaScript
  • Dealing with all the callbacks in a sane fashion
  • Using the same code in node and the browser
  • Building realtime apps with Socket.IO
  • Using the express framework
  • An overview over testing tools available for node
  • Deploying node to EC2 / Joyent

The first half of the day will be guided by slides (which will be made available afterwards), with the second half being a hands-on session where we will build a small node app from scratch.

About the instructor

This workshop will be led by Felix Geisendörfer (that's me). He is one of the earliest and most active contributors to node, author of over 20 npm modules, and runs one of the biggest node applications in production over at

In addition, Tim Koschützki, who is also a co-founder at transloadit, will be available all day to help with individual questions and troubleshooting.

Questions & More Workshops

If you have additional questions or can't make it to this workshop, please head over to the workshop page, which has information on other upcoming workshops and how to get your questions answered.

--fg


Why are you not using a SSD yet?

Wed, 23 Feb 2011 22:15:01 +0000

If you are a developer and you have not switched to an SSD yet, what is your excuse? Let me explain. I switched to an SSD a little over a week ago, and it's a different world. You know that feeling of having just bought & set up a new machine, when everything still runs very fast? Well, an SSD will make every single day feel just like that, except much faster.

But I already knew that, so why has it taken me, and apparently you who is reading this, so long? Well, my main problem was that I have a few big things on my hard disk, namely music, photos and virtual machine images. This means that I need a hard disk of ~300 GB to work comfortably. However, the SSD I was interested in only comes in 40, 60, 120, 240 and 480 GB. The 480 GB costs ~$1,580 right now. A 240 GB SSD costs ~$520, which seems much less outrageous, but unfortunately that's still too small if it were my only disk. So for a while, I thought I'd have to wait another 1-2 years before enjoying the SSD experience.

That was until I came across this article, which explained that you could replace your MacBook Pro's optical drive with an SSD. This means I could add an SSD to my machine without giving up the luxury of cheap mass storage. With this in mind, I decided to get a 120 GB SSD, which is plenty of space for my core system and applications.

I followed a few YouTube videos for swapping out the disks, and I also placed my previous HDD in the optical bay slot, since I've heard reports of hibernation problems if you put your primary disk there. Making the new SSD my primary hard disk was easy as well. My initial attempt using Time Machine failed, so I simply booted up my system from the old primary HDD and used Carbon Copy Cloner to copy all data (excluding my music, images and VMs) to my new SSD. After that I made the SSD my primary boot disk using the "Startup Disk" preference pane and rebooted. The whole operation took about 1-2 hours.

So how has this changed my life? First of all, boot time is incredible.
Compared to Tim's Mac (which is now scheduled for an upgrade ASAP as well), my machine goes from 0 to starting Photoshop in 48 seconds. Tim's machine takes 2 minutes and 50 seconds. Note: it takes about the same time for both machines to boot the kernel, but my machine is instantly ready at that point now. Starting programs is either instant or 2-3 times faster than before. Recursive grep (using ack) is insanely fast / instant, even on big projects. And git - it's a different world. If you've ever waited for minutes while running 'git gc' on a big project, an SSD turns this into seconds. Everything feels just incredibly fast.

With this in mind, what's your excuse for not treating yourself to an SSD now?

--fg

PS: If you think you would miss your optical drive: you can get an external USB one for ~$40 on Amazon. If you really need the internal one back, I guess it would take you about 10-15 minutes to put it back in once you know the procedure.

PPS: If you're worried about the difficulty of replacing the disk: it's very easy; all you need to know is how to operate a screwdriver. However, make sure you've got the right tools. The OWC disk I'm recommending comes with a set of tools if you order it with the data doubler for the optical bay.

PPPS: My friend Joel pointed out the lack of TRIM support in OS X as a reason for not getting an SSD yet. That's a valid argument, but the OWC discs do not suffer from the lack of TRIM. [...]

Talks, talks, talks

Fri, 18 Feb 2011 15:58:33 +0000

I've been in Atlanta for the past two weeks, and thanks to the kind help of a few folks, I was able to present at two meetups, as well as Startup Riot 2011.

First up was a new talk at the Atlanta Ruby Meetup: "Nodejs - Should Ruby Developers Care?" The talk was an attempt to take out all the Kool-Aid and hype and focus on node's true strengths as well as weaknesses. I think the whole thing was very well received, but I certainly could have done a little better on the delivery. Download: nodejs-should-ruby-developers-care.pdf (733 KB)

Next up was the 4th edition of my general introduction talk to node: "Nodejs - A quick tour (v4)". This version of the talk was updated for the freshly released v0.4, and I've also tweaked some other slides to the point where I'm very happy with it. It seems to do a great job getting people excited about node, as well as highlighting sensible use cases. Download: nodejs-a-quick-tour-v4.pdf (610 KB)

And last but not least, I had the chance to do a 3-minute pitch for Transloadit at Startup Riot. Download: transloadit-startupriot.pdf (640 KB)

Doing the actual presentation was quite scary. I have never given a talk this short; you basically don't get any time to warm up and get into things. You have to go out and give your best right away. Having an audience of ~500 people didn't help either. However, I think I pulled it off fairly well. My main message was: "Save the time, save the money, save the shrink - use transloadit", and I highlighted some of the cooler aspects of our service, such as the realtime encoding. Lots of people came by our table afterwards to find out more about the service, including a few VCs and angels (we're not looking for investment right now, but seeing their interest feels good regardless : ).

There are a few more talks coming up in the next couple of months, but I also hope to find some more time for actual blogging again. I certainly want to start writing a few articles about testing JavaScript. --fg [...]