Wednesday, January 16, 2008

Backup Strategies


You've got data. Tons of data, more and more each day. The stuff piles up faster than a group of Hazzard County police cars chasing the Duke boys. Media files, project files, graphics files...how do you make sure you don't lose data, especially in the midst of an important project? Trust me...nothing torpedoes your credibility like having to call a client to tell them their project is hosed because your hard drive crashed. Larger clients, in particular, expect you to have a data backup plan in place.

So what to do? You're a small shop, perhaps even a one-person operation, and it's simply not cost-effective for you to install a server farm with automated backup, not to mention an I.T. department to keep it all running. But you still want to protect yourself from data loss. You need a backup plan.

Here's what we do at Pixel Workshop. We have a computer on our network that's dedicated to being our file server. Nothing fancy; in fact, it's a Power Mac G4 running the standard, non-server version of Mac OS X. On this machine we have an external FireWire drive, creatively called "Server Storage." This 400 gig drive contains active project files and is not a long-term archival area. Copies of stuff we're currently working on go there. (Video media is not stored here - that stays on the edit suite's local, high-speed drives.) We're small enough that we don't need sophisticated check-in/check-out accounting for files. As long as we all communicate, it's not hard to keep up with versioning.

Every night at 2am, Server Storage gets backed up to a drive (also on the server) called (wait for it!) Server Storage Backup. So far, so good, right? We've got two copies of everything, and it's being backed up automagically. Well, kind of. This kind of setup works pretty well, but it doesn't really protect you against the most dangerous enemy your files have. Yes, friends, I'm talking about (cue the scary music) Operator Error.
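(A quick aside for the scripting-inclined before we get to the horror story: stripped of the niceties, that nightly job is just a mirror from one drive to another. Here's a bare-bones sketch in Python of what such a job could look like. The volume paths are placeholders, not a transcript of our setup, and real backup software adds the logging, error handling, and sanity checks this leaves out.)

    #!/usr/bin/env python
    """Rough sketch of a nightly mirror job (illustrative only)."""

    import subprocess
    import sys

    SOURCE = "/Volumes/Server Storage/"              # hypothetical mount point
    DESTINATION = "/Volumes/Server Storage Backup/"  # hypothetical mount point

    def mirror(source, destination):
        """Mirror source to destination with rsync: -a preserves timestamps
        and permissions, --delete removes anything that's gone from the
        source, so the backup ends up an exact copy of the original."""
        status = subprocess.call(["rsync", "-a", "--delete", source, destination])
        if status != 0:
            sys.exit("rsync reported a problem; the backup may be incomplete")

    if __name__ == "__main__":
        mirror(SOURCE, DESTINATION)

Scheduling it for 2am is a one-line cron or launchd entry; the script itself just makes the copy.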

You accidentally overwrite a folder, delete some files, or save the wrong version, and your backup software dutifully reproduces your mistake on the backup drive, unless you catch the error before the scheduled backup. Now you're doubly screwed! This has happened to me. It's a sinking feeling to haughtily declare, "No problemo! I have a backup..." only to discover your backup is just as hosed as the original. This is also why I don't rely exclusively on RAID-5 solutions. Yes, you're protected from hardware failure, but it doesn't save you from yourself when it's the middle of the night, you haven't slept in days, and you accidentally overwrite the main project file.

For all these reasons we employ a second backup drive, called "Server Storage Weekend Backup." This drive backs up Server Storage every weekend, in the middle of the night. This backup may not always have the most recent versions of everything, but it's added protection against a catastrophic user error, typically giving you a few days of leeway to catch a mistake before it propagates to every copy.
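If you were rolling your own with a sketch like the one above, the weekend pass would be the very same mirror pointed at a different drive on a different schedule. Something like this, with the paths again purely hypothetical:

    # One routine, two destinations. The schedules themselves would live in
    # cron or launchd, not in the script.
    mirror("/Volumes/Server Storage/", "/Volumes/Server Storage Backup/")          # every night at 2am
    mirror("/Volumes/Server Storage/", "/Volumes/Server Storage Weekend Backup/")  # weekends only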

This system has been running for a couple of years now, and it works very well. It's not perfect, of course. The drives are all in the same physical location, so a fire or flood could wipe them all out. But for the files on active projects, our most important files, it provides a couple of solid layers of protection.

There are other strategies we're considering adding, and this is a constantly evolving process, usually triggered by some inadvertent data loss. Long-term archival storage is a different subject, for a different post.

In the meantime, chime in and share your backup strategies!


2 comments:

Anonymous said...

In my home office I have a NAS device with a capacity of 200 gigs. It's a pretty simple NAS that takes a regular hard drive, so if I ever need to increase capacity I can just change the hard drive. My computers on the network are set up to run incremental backups, and these are all sent to the NAS. I prefer doing incremental backups so I can go back to previous versions of files if I need to, and the backup jobs also run a lot faster if there haven't been many changes since the last backup. And I make sure to do a media backup (CD, DVD or tape) every few weeks just in case the backup drive crashes.
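For anyone wondering what "incremental" boils down to in practice, here's a rough Python sketch of the idea: each pass copies only the files changed since the previous pass into its own dated folder, so older versions stick around. The paths, folder layout, and stamp-file trick are all invented for illustration; real backup software handles this far more robustly.

    """Illustrative-only sketch of an incremental backup pass."""

    import os
    import shutil
    import time

    SOURCE = "/Users/me/Documents"        # hypothetical working folder
    BACKUP_ROOT = "/Volumes/NAS/backups"  # hypothetical NAS mount point

    def incremental_backup(source, backup_root):
        # The stamp file's modification time records when the last pass ran.
        stamp = os.path.join(backup_root, ".last_run")
        last_run = os.path.getmtime(stamp) if os.path.exists(stamp) else 0.0

        # Each pass gets its own dated folder, e.g. .../backups/2008-01-16
        target = os.path.join(backup_root, time.strftime("%Y-%m-%d"))

        for dirpath, _, filenames in os.walk(source):
            for name in filenames:
                src = os.path.join(dirpath, name)
                if os.path.getmtime(src) > last_run:
                    dest = os.path.join(target, os.path.relpath(src, source))
                    if not os.path.isdir(os.path.dirname(dest)):
                        os.makedirs(os.path.dirname(dest))
                    shutil.copy2(src, dest)  # copy2 keeps timestamps

        # Touch the stamp so the next pass knows where this one left off.
        open(stamp, "w").close()

    if __name__ == "__main__":
        incremental_backup(SOURCE, BACKUP_ROOT)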

Dave Bittner said...

A NAS (Network Attached Storage) is a nice option, to be sure. I've considered one, but haven't heard good things about data rate performance.
On my laptop I've got Time Machine running, as well as a regularly scheduled backup. My internal HD is 120 gigs, and I have an external FW drive that's 300 gigs. I partitioned it into two equal volumes, one for Time Machine and one for regular backups. (I use SilverKeeper for the regular backups.) Haven't had to make use of Time Machine yet!