Automatically Back Up Your Web Site Every Day
[Image: Website Backup with Rsync]
I run several database-driven sites and applications, including this blog, so my backup system has to be solid. Here's how I have it set up.
This method assumes a few things:
- You're running a LAMP-based web site (Linux, Apache, MySQL and PHP/Perl/Python).
- You have command line access to your web server via SSH.
- You know how to create folders and change file permissions with chmod.
- You're comfortable with running bash scripts at the command line on your server and setting up cron jobs.
- You know where all your web server's files are stored, which databases you need to back up, and what username and password you use to log into MySQL.
- In order to have remote data backup, you need access to another server that's available via SSH in addition to your site's server. I asked a friend of mine for an account on his server to store some backup files and he kindly obliged. If you don't have a friend with a server at a different host, you can run an always-on server at home and back up to there. I prefer not to have a computer on at all times in my home, where bandwidth speeds can be slow, so I'd recommend finding a friend to back up to (and you can offer your friend the same courtesy).
All systems go? Let's get your backup system set up.
First: Local Backup
In order to back up your web site, your script has to back up two things: all the files that make up the site, and all the data in your database. In this scheme you're not backing up the HTML pages that your PHP or Perl scripts generate; you're backing up the PHP or Perl source code itself, which accesses the data in your database. This way, if your site blows up, you can restore it on a new host and everything will work the way it does now.
First, SSH into your web server, and in your home directory, create a folder named backups. Inside this folder, create a file named backup.sh. Then, create a folder named files.
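In shell terms, the steps above look like this (assuming you're starting in your home directory):

```shell
mkdir -p ~/backups/files   # backups/ holds the script; files/ will hold the archives
touch ~/backups/backup.sh  # the backup script we'll fill in next
```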
The file we care about right now is backup.sh. This file will be the script that zips up your data and saves the archives in the files folder.
The script I run is heavily based on an example I found on The How-To Geek's wiki. Here's the source code of backup.sh that takes care of smarterware.org's files and database:
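(The original listing isn't reproduced in this copy; the sketch below is reconstructed from the line-by-line walkthrough later in this section, so its line numbers match the references that follow. The values on lines 3 through 6 are placeholders.)

```shell
#!/bin/sh

THESITE="example.com"
THEDB="my_database_name"
THEDBUSER="my_database_user"
THEDBPW="my_database_password"
THEDATE=`date +%d%m%y%H%M`

mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /var/www/vhosts/$THESITE/backups/files/dbbackup_${THESITE}_${THEDATE}.bak.gz

tar czf /var/www/vhosts/$THESITE/backups/files/sitebackup_${THESITE}_${THEDATE}.tar \
    /var/www/vhosts/$THESITE/httpdocs/

find /var/www/vhosts/$THESITE/backups/files/ -name "sitebackup*" -mtime +5 -exec rm {} \;
find /var/www/vhosts/$THESITE/backups/files/ -name "dbbackup*" -mtime +5 -exec rm {} \;
```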
Copy and paste this source code into your backup.sh file. To successfully run this script in a setup similar to mine, on lines 3 through 7, you must replace example.com, my_database_name, my_database_user, and my_database_password with the right values for your web site.
This version of the script makes two assumptions about file locations. On my web server (and on many, but not all, setups), the home directory is a path that looks like this: /var/www/vhosts/example.com/ (where example.com is your web site's domain). All of the public, web-accessible files are located in /var/www/vhosts/example.com/httpdocs/.
Your web site file path may vary. If it does, in the script's source code, replace /var/www/vhosts/$THESITE/backups/ with the path to your backups folder location, and replace /var/www/vhosts/$THESITE/httpdocs/ with the location of your site's web-accessible files.
Let's walk through what this script is doing. After setting some variables in lines 3 through 7, line 9 runs a mysqldump of all the data in the database named on line 4, compresses it, and stores it in the files directory using a filename that looks like dbbackup_example.com_1402120101.bak.gz. Lines 11 and 12 archive the site's source code files in the httpdocs directory and store them in the files directory, using a filename that looks like sitebackup_example.com_1402120101.tar. Notice that both filenames include the date, so you can see when each backup was made.
Finally, lines 14 and 15 delete any backups made more than 5 days ago. You're going to run this backup script nightly, and the files will take up a lot of space quickly; that's why these last two commands delete older backups. You can change the number 5 to however many days of backups you want to keep.
In order to run this script, you must chmod +x backup.sh. Run it manually to make sure it generates the backup files you expect. Finally, schedule it to run as often as you like in your crontab. To run it at 1:01 am every morning, your crontab would look like this:
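The crontab entry isn't shown in this copy; here's a minimal sketch, assuming the Plesk-style path used above (edit your crontab with `crontab -e`):

```shell
# minute hour day-of-month month day-of-week  command
1 1 * * * /bin/sh /var/www/vhosts/example.com/backups/backup.sh
# (in a system-wide crontab like /etc/crontab, a username field precedes the command)
```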
Replace the username, web site name, and paths with your information.
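Second: Remote Backup

Local backups won't save you if your server's disk dies, so next you'll copy the files folder to your remote server with rsync. On the remote server, create a folder named offsitebackups, then run a command along these lines there (the username and paths are assumptions; substitute your own):

```shell
# Pull the backup archives from your web server to the remote machine over SSH
rsync -avz -e ssh username@example.com:/var/www/vhosts/example.com/backups/files/ ~/offsitebackups/
```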
That rsync command syncs all the files in your host's backups folder to your remote server's offsitebackups folder. Run it once to make sure it works; it should prompt you for the password to log into your web server. When it's done syncing, you should see your backup files in the offsitebackups folder.
The problem is, you won't be around to enter the password every night when cron tries to run it. To run it without intervention, you'll need to set up passwordless login to your web server. This excellent tutorial on automating backups with rsync runs through those steps as well.
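The usual recipe for passwordless login looks like this, run on the remote backup machine (the username and host are placeholders):

```shell
# Generate an SSH key pair; press Enter at the passphrase prompts to leave it empty
ssh-keygen -t rsa
# Install the public key on your web server so rsync can log in without a password
ssh-copy-id username@example.com
```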
Setting up local and remote backup of your web server's database and files requires upfront time and effort, but once it's in place, you can forget it. Using this system you can blog away, keep your blogging software up to date, and manually edit files directly on your web server without having to worry about losing changes or being unable to restore your data.