Sometimes I really enjoy setting out on a mission to find a solution to a problem that I have. It’s like an adventure. You do your research, there’s trial and error, and eventually you find your solution and get that nice kick from it. This past week I was looking into backup software, since the situation at work was getting out of control. I was actually charged with finding backup software for our workstations, but the solution I found while doing that could be applied to our development servers as well.

The software I found is called CrashPlan and it is really effing neat. You can back up to an external drive, to a computer on your own network, to a friend somewhere else on the Internet, or to the cloud. The first three alternatives are free and the cloud solution is something you pay for. The latter might be interesting for your average computer user and costs only 60 bucks a year for an unlimited amount of data. So check it out! And don’t be fooled by the horrendous homepage.

What was interesting in my case was that I had been using a method where, every night, I archived the WWW repo, database data and configuration files from our one development server. I didn’t do this incrementally, which meant that every night an additional 4–8 GB of data got added to the server here at home that I back up to. Over time I move the archives off to an external drive, but in the end it’s not very efficient: after about 250 days I’m using 1 TB on that drive. I realized that I needed a solution that backed up incrementally but still gave me versions/snapshots of the data, so that I could restore files from a certain point in time. It also needed to support several computers, since we are now using more than one server.
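For reference, the old non-incremental routine looked roughly like this (the function and directory names are made up for illustration, not the actual setup):

```shell
#!/bin/sh
# Sketch of the old nightly routine: tar the whole tree into a new
# date-stamped archive every night. Nothing is shared between runs,
# so storage grows by the full archive size (4-8 GB here) each night.

nightly_archive() {
    src=$1   # directory to archive, e.g. the WWW repo checkout
    dest=$2  # where the dated tarballs pile up
    stamp=$(date +%Y-%m-%d)
    tar czf "$dest/backup-$stamp.tar.gz" -C "$src" .
}

# At roughly 4 GB per run, 250 nights of this is about 1 TB.
```

With incremental backup, only the first run would cost the full size; every run after that would only add what actually changed.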

CrashPlan can do all of this and it’s really easy to set up as well. When you install the software you get a backup engine running in the background and a GUI that lets you configure that engine: what you want to back up, how often, how many versions to keep, which versions to keep after one week, one month, one year, and so on. It also does data de-duplication, meaning that when a file changes it won’t store a whole new copy of it, only the parts that changed. Awesome! You can even install it on headless servers, i.e. servers without a GUI: you install the engine and then connect to it remotely with a GUI running on another machine. It takes some fiddling with configuration files and possibly an SSH tunnel, but it’s all worth it.
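The commonly documented recipe for the headless case goes something like this (port numbers and file locations may differ between versions, so treat this as a sketch and check your own install):

```shell
# 1. Forward a local port to the engine's service port on the server.
#    4243 is the port the CrashPlan engine listens on for its GUI;
#    verify against your install if it differs.
ssh -L 4200:localhost:4243 user@devserver

# 2. On your desktop, point the GUI at the forwarded port by editing
#    conf/ui.properties in the CrashPlan install directory:
#
#      servicePort=4200
#
# 3. Start the GUI; it now talks to the remote engine through the
#    tunnel. Revert servicePort afterwards if you also want to manage
#    the engine on your own machine again.
```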

In my case I set up one account for my server here at home, and that backup engine only acts as a receiving server. For the servers that I want to back up I use another account, and that account I bind to my computer here at home using the “backup to friend” feature. The app takes care of all the port mapping and such, as long as you have UPnP enabled on your router/gateway. Now both of my servers are backing up to my home server, and I can easily restore files with the CrashPlan GUI from another computer.

So, all in all, both thumbs up! If you don’t want to use CrashPlan and you have a Linux server that you need backups of, it might be worth checking out rsnapshot. It does the incremental backup thingy as well and it supposedly works really well. I haven’t tried it yet, and I’m not sure I will given how well CrashPlan works. I guess the only bad thing about CrashPlan is that you’re dependent on their service, even if you don’t pay anything.
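If you go the rsnapshot route, the setup is a small config file plus a cron job. A minimal sketch (the paths are just examples, and note that fields in rsnapshot.conf must be separated by tabs, not spaces):

```
# /etc/rsnapshot.conf (excerpt) -- fields are TAB-separated!
snapshot_root	/backup/snapshots/

# Keep 7 daily and 4 weekly snapshots; older ones get rotated away.
retain	daily	7
retain	weekly	4

# What to back up. rsnapshot uses rsync plus hard links, so files
# that haven't changed cost no extra space between snapshots.
backup	/var/www/	localhost/
backup	/etc/	localhost/
```

Then you run `rsnapshot daily` (and `rsnapshot weekly`) from cron, and you get the same kind of incremental, versioned snapshots to restore from.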