Overly Complicated Project: FTP backups on the local network

With the switch of my cloud backups from CrashPlan to Backblaze, I lost the ability to back up from a network share. There are convoluted ways to make network shares behave like local disks, but the results seem to be a mixed bag. I’ve had this idea for a while now and finally decided to make it happen: FTP backups on my local network from any of my Linux systems, and possibly Windows.

Goal: backups that are as easy as possible and need little to no upkeep after install. Think I got it down.

Here’s the premise: I want my Linux systems to automatically archive and back everything up, then upload the resulting tarballs to an FTP server that in turn backs up to my cloud provider (Backblaze). This was pretty straightforward, since everything is on the LAN and Perl makes FTP setup very simple.

Perl has a decent module for FTP (and SFTP!) that I used. With it, the script runs off a crontab on the Linux machine I’m testing on and will (a rough sketch follows the list):

  • Archive and compress all the files in my webroot directory
  • Back up all related MySQL databases
  • Compress all of it into a single tgz
  • Upload that tarball to the FTP service running on my backup server
  • That server then uploads everything to my cloud backups
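
For the curious, here’s roughly what the client side looks like. This is a minimal sketch rather than my exact script: it assumes Perl’s stock Net::FTP module (my guess for the FTP module mentioned above), a webroot at /var/www, and placeholder hostnames and credentials (backups.local, backupuser, and so on) that you’d swap for your own.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;
    use POSIX qw(strftime);

    # --- Placeholder paths and credentials: swap in your own ---
    my $webroot  = '/var/www';        # directory to archive
    my $db_user  = 'backupuser';      # MySQL credentials
    my $db_pass  = 'secret';
    my $ftp_host = 'backups.local';   # the FTP server on the LAN
    my $ftp_user = 'backups';
    my $ftp_pass = 'secret';

    my $stamp   = strftime('%Y%m%d', localtime);
    my $workdir = "/tmp/backup-$stamp";
    my $tarball = "backup-$stamp.tgz";

    -d $workdir or mkdir $workdir or die "mkdir $workdir: $!";

    # Archive the webroot and dump the databases into the working directory
    system("tar czf $workdir/webroot.tgz $webroot") == 0
        or die 'webroot archive failed';
    system("mysqldump -u$db_user -p$db_pass --all-databases > $workdir/mysql.sql") == 0
        or die 'mysqldump failed';

    # Roll everything up into one tarball
    system("tar czf /tmp/$tarball -C $workdir .") == 0
        or die 'final tarball failed';

    # Push it to the FTP server; the server side handles the cloud upload
    my $ftp = Net::FTP->new($ftp_host) or die "Cannot connect to $ftp_host: $@";
    $ftp->login($ftp_user, $ftp_pass)  or die 'Login failed: ',  $ftp->message;
    $ftp->binary;                      # tarballs are binary data
    $ftp->put("/tmp/$tarball")         or die 'Upload failed: ', $ftp->message;
    $ftp->quit;

From there, a crontab entry along these lines (the script path is just a placeholder) runs it nightly at 3 AM with no further babysitting:

    0 3 * * * /usr/local/bin/ftp-backup.pl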

With some fine-tuning, I can probably run this in some form across many of my Linux systems (I can count 5 offhand currently) and provide a backup solution for all of them with almost no further work after setup/staging. Could this be done better? Probably. A slick SFTP or SCP solution running under Cygwin on Windows would be sweet, but this is a quick and easy way to get there.
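
If I ever go the SFTP route, the swap should be small. As a rough sketch (assuming CPAN’s Net::SFTP::Foreign module and the same placeholder names as above), the upload portion would become something like:

    use strict;
    use warnings;
    use Net::SFTP::Foreign;

    my $tarball = '/tmp/backup-20250101.tgz';   # whatever the script built

    # Password auth here needs IO::Pty installed; key-based auth sidesteps that
    my $sftp = Net::SFTP::Foreign->new('backups.local',
        user     => 'backups',
        password => 'secret',
    );
    $sftp->error and die 'SFTP connection failed: ' . $sftp->error;

    # put() transfers in binary, so no type switch is needed
    $sftp->put($tarball, 'backup-20250101.tgz')
        or die 'Upload failed: ' . $sftp->error;

Since that all rides over SSH, the same script under Cygwin’s Perl should in principle cover the Windows boxes too.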

Props to the FileZilla project for their server software and to ZEROF on GitHub for this prime example: https://gist.github.com/ZEROF/571e1707880c97d97f03

Cheers and happy backups!
