
Developing and deploying applications with Fabric and Subversion

November 6, 2008

Twenty-four hours ago, I was deploying applications from development to production environments in about 5-6 steps. Today, tomorrow and every day in the future, I'm doing it in 1 step, with Fabric.

Fabric is:

...a simple pythonic remote deployment tool. It is designed to upload files to, and run shell commands on, a number of servers in parallel or serially. These commands are grouped in tasks (regular python functions) and specified in a 'fabfile.' It is a bit like a dumbed down Capistrano, except it's in Python, doesn't expect you to be deploying Rails applications, and the 'put' command works. Unlike Capistrano, Fabric wants to stay small, light, easy to change and not bound to any specific framework.

It is awesome. But don't let me tell you, let me show you.

We have lots of projects floating all over the place, all neat and tidy in our Subversion repository. When I'm ready to start building a new feature or fix a bug, I like to have a copy of that application's production database on my local machine for development. Most of the time, these applications are Drupal- or Django-based. When it's time to get started, I do something like this:
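In practice that means cd-ing into the project directory and running a small per-project shell script (the path and script name here are just placeholders):

    $ cd ~/projects/myproject
    $ ./update_local_db.sh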

The shell script I run essentially does this (a rough sketch of the script follows the list):

  • log into the remote production server
  • take a snapshot of the production database
  • save the dumpfile
  • log out of the server
  • transfer the file from the production server to my workstation
  • remove the existing development database from my local MySQL installation
  • create a new database to contain the production database
  • import the production database into the new database

It's nice that the script takes care of all of that for me - but, you see, there are 10-20 of those scripts (one for each project). This becomes an enormous headache when we change servers or add a new feature to the script. We needed a better solution.

Enter Fabric. I got it up and running in about 5 minutes and began migrating that shell script into one 'fabfile'. However, since every project needs exactly the same functions, I also needed a way to share them across projects. Fabric makes this easy.

In each project root, there is a file named 'fabfile.py', whose contents look roughly like this:
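(Treat this as a sketch rather than the real file: the host, user, credentials and path are placeholders, and every setting name apart from fab_hosts is one I've picked for illustration; the shared commands in 'fabric_global.py' read these settings back out.)

    # fabfile.py -- per-project settings (a sketch; every value is a placeholder)
    from fabric_global import *     # pull in the shared commands described below

    # Production server and SSH user for this project
    config.fab_hosts = ['www.example.com']
    config.fab_user = 'deploy'

    # MySQL credentials for this project's production database
    config.mysql_db = 'myproject'
    config.mysql_user = 'myproject'
    config.mysql_pass = 'secret'

    # Path to the Subversion checkout on the production server
    config.remote_path = '/var/www/myproject'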

Important note: The reason I use config.fab_hosts instead of 'set(fab_hosts = ['...'])' is that I've built my Fabric installation from the git master branch. If you've downloaded the 0.0.9 package, use:
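That is, something like this (with your own server, of course):

    set(fab_hosts = ['www.example.com'])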

The fabfile sets some basic variables for the server we're connecting to, the user we connect with, and the MySQL credentials. The very first thing the file does is import a file named 'fabric_global.py', which looks something like this:
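What follows is a rough sketch of that shared module rather than the original file. The command names (production_to_dev, deploy) and the config settings they read are placeholders matching the fabfile sketch above, and it assumes Fabric's run(), local() and config are visible in this module just as they are in a fabfile; the exact mechanics differed a bit between 0.0.9 and the git master of the day.

    # fabric_global.py -- shared Fabric commands, imported by every project's fabfile.
    # A sketch: the command names and the config settings they read (mysql_*,
    # remote_path, fab_user, fab_hosts) are placeholders, and it assumes run(),
    # local() and config are available here just as they are inside a fabfile.

    def production_to_dev():
        "Replace the local development database with a fresh production snapshot."
        dump = '/tmp/%s.sql' % config.mysql_db

        # Take a snapshot of the production database on the remote server
        run('mysqldump -u%s -p%s %s > %s'
            % (config.mysql_user, config.mysql_pass, config.mysql_db, dump))

        # Copy the dump file down to the workstation
        local('scp %s@%s:%s .' % (config.fab_user, config.fab_hosts[0], dump))

        # Rebuild the local development database (same name as production here)
        # and import the snapshot
        local('mysqladmin -u root -f drop %s' % config.mysql_db)
        local('mysqladmin -u root create %s' % config.mysql_db)
        local('mysql -u root %s < %s.sql' % (config.mysql_db, config.mysql_db))

    def deploy():
        "Commit local changes to Subversion, then update the checkout on the server."
        local('svn commit -m "Deploying latest changes"')  # a real log message is better
        run('svn update %s' % config.remote_path)          # path set in the project fabfile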

At first glance it may look a little confusing, but this code is something like 10% of what we'd have if it were duplicated for each project (along with all the additional commands).

The 'fabric_global.py' file defines two functions that we can run on a codebase. One is for updating the development database with the production database, and the other is for simply committing file changes and updating them on the server.

Now, when I need to grab the production database at the beginning of a project, I simply do this:
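Using the command name from the sketch above:

    $ fab production_to_dev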

Well crap, that's a lot easier.

When it's time to deploy my code changes, I simply do:
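Again using the sketch's naming:

    $ fab deploy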

If I had already committed my changes and simply wanted to update them on the production server, I would just do:
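With just the two shared commands sketched above, the same command covers this case as well, since an svn commit with nothing outstanding is a no-op and the remote svn update still runs (in a real setup you might give the update step its own task):

    $ fab deploy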

Quite amazing, if you ask me.

Also, Fabric can do a whole lot more than what I just demonstrated, so check out the docs.

I'd like to thank the Fabric team for probably preserving a few years of my lifespan. Also, it should be noted that the current version of Fabric is 0.0.9, which should give you an idea of how much more awesomeness is still to come in future releases.