Simple Server Backups
01/30/2012 - Linux 

If you have spent any amount of time researching how to perform backups on Linux systems, you will find there is a wide array of backup options available.  Many solutions require dedicated systems, special applications or lots of custom scripting to implement.  Some of the more straightforward apps/solutions simply don't seem to achieve what I want to do (keep a copy of the data on the local system &/or FTP/SSH a copy of the data off-site).

I'm not knocking anyone else's solutions for Linux backups.  If anything, I commend them for all of their hard work & for sharing their solutions with the world, so we in the Unix/Linux community have a wider array of options to select from.  With my growing network, I've actually been looking at more advanced backup solutions lately.  But at present, I have opted to keep it simple for my home PCs, my office workstation & my personal web servers.


This is a basic scripted version.  When run from the command line or through a cron job, it will back up the important data & MySQL data on the same system.

#!/bin/bash
# backup important data & all SQL databases
# the -z flag gzip compresses the archive, to match the .tgz file name
tar -pczv --exclude-from /root/backup_exclude.txt --file /backup/system_backup.tgz /root /etc /home /var
# note: no space after -p, otherwise mysqldump prompts for the password
mysqldump -u root -ppassword --opt --all-databases > /backup/mysql_databases.sql

Usage Notes:

  • The above example assumes you want to back up all of "/root", "/etc", "/home" & "/var" to a file named "system_backup.tgz", which is a tar gzipped file in the "/backup" folder.
  • If you wish to exclude a specific file or folder from the backup, simply append a new line to the "/root/backup_exclude.txt" file with the path to the file/folder to exclude (see the example exclude file after these notes).
    Just remember to remove the leading "/" from the file/folder path entered.  (i.e. enter "home/user/MP3s" if you want to exclude "/home/user/MP3s")
  • The above example also assumes you want to back up all MySQL data from the local host, using the MySQL login of 'root' username & 'password' password.
    Adjust the line as necessary for your particular MySQL server or simply comment out the line if you don't want to back up MySQL data.
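
As a quick sketch, a "/root/backup_exclude.txt" file might look something like this.  The entries are just hypothetical examples; list whatever you want tar to skip, one path per line, without the leading "/":

home/user/MP3s
home/user/Downloads
var/cache
var/tmp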

This is a slightly more advanced version of the above script, which does the same as above, but separates the "/root", "/etc", "/home" & "/var" sections into different files.

#!/bin/bash
# backup important data & SQL databases into separate gzipped archives
tar -pczv --exclude-from /root/backup_exclude.txt --file /backup/system_backup_root.tgz /root
tar -pczv --exclude-from /root/backup_exclude.txt --file /backup/system_backup_etc.tgz /etc
tar -pczv --exclude-from /root/backup_exclude.txt --file /backup/system_backup_home.tgz /home
tar -pczv --exclude-from /root/backup_exclude.txt --file /backup/system_backup_var.tgz /var
mysqldump -u root -ppassword --opt --all-databases > /backup/mysql_databases.sql


If you also want to FTP the backup files to a remote server for extra redundancy, you can add the following after doing the above:

# FTP backup data to remote server via curl
curl -T /backup/system_backup_root.tgz -u root:password ftp://192.168.1.2/backup/
curl -T /backup/system_backup_etc.tgz -u root:password ftp://192.168.1.2/backup/
curl -T /backup/system_backup_home.tgz -u root:password ftp://192.168.1.2/backup/
curl -T /backup/system_backup_var.tgz -u root:password ftp://192.168.1.2/backup/
curl -T /backup/mysql_databases.sql -u root:password ftp://192.168.1.2/backup/

Usage Notes:

  • The above example assumes you want to FTP upload a copy of those files to a server located at '192.168.1.2' in the "/backup" folder.
  • The above example also assumes you will use the FTP login of 'root' username & 'password' password.  Adjust these lines as necessary for your particular FTP server (see the sketch after these notes if you'd rather keep the credentials out of the command line).
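
As a side note on keeping the FTP password off the command line (where it can show up in "ps" output & your shell history), curl can read its login info from a "~/.netrc" file instead, via its "-n" / "--netrc" option.  A minimal sketch of "/root/.netrc" (be sure to "chmod 600" it), assuming the same server & login as above:

machine 192.168.1.2
login root
password password

With that file in place, the uploads become something like "curl -n -T /backup/system_backup_root.tgz ftp://192.168.1.2/backup/".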

If you want to SSH the backup files directly to a remote server, you can do something like this:

#!/bin/bash
# backup important data & SQL databases, streaming them straight to the remote server over SSH
# --file - writes the archive to stdout, so it can be piped into ssh
tar -pczv --exclude-from /root/backup_exclude.txt --file - /root | ssh root@192.168.1.2 "cat > /backup/system_backup_root.tgz"
tar -pczv --exclude-from /root/backup_exclude.txt --file - /etc | ssh root@192.168.1.2 "cat > /backup/system_backup_etc.tgz"
tar -pczv --exclude-from /root/backup_exclude.txt --file - /home | ssh root@192.168.1.2 "cat > /backup/system_backup_home.tgz"
tar -pczv --exclude-from /root/backup_exclude.txt --file - /var | ssh root@192.168.1.2 "cat > /backup/system_backup_var.tgz"
mysqldump -u root -ppassword --opt --all-databases | ssh root@192.168.1.2 "cat > /backup/mysql_databases.sql"

Usage Notes:

  • The above example assumes you want to SSH upload those files to a server located at '192.168.1.2' in the "/backup" folder.
  • You will be prompted for the SSH login password, if you have not already set up an SSH key beforehand.
    If you will be invoking this backup process via a cron job, you should use SSH keys to get rid of the SSH password requirement (see the sketch after this list).
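
A minimal sketch of setting up a password-less SSH key & scheduling the backup via cron; I'm assuming the script above was saved as "/root/backup.sh", so adjust the path & schedule to taste:

# generate an SSH key pair with no passphrase (just press Enter at the passphrase prompts)
ssh-keygen -t rsa
# copy the public key to the remote server, so future logins don't ask for a password
ssh-copy-id root@192.168.1.2
# then add something like this to root's crontab ("crontab -e") to run the hypothetical /root/backup.sh nightly at 2am
0 2 * * * /root/backup.sh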

And what is a backup solution without explaining how to restore the data?

To restore the tar gzipped data, upload the data to the affected system & run this command as root:

tar -xpzvf /backup/system_backup.tgz -C /

To restore the MySQL data, run this command:

mysql -u root -ppassword < /backup/mysql_databases.sql

Usage Notes:

  • You will need to adjust any login information, file names & file paths accordingly.
  • These commands also assume you want to restore all of the information contained in the backup.
  • If you wish to selectively restore data from the backups, you should refer to the man/help pages for those particular commands, for how to adjust the above to achieve that (a couple of quick examples follow this list).
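
As a quick sketch of what a selective restore can look like (the folder & database names here are just hypothetical examples):

# extract only one folder from the archive; note the member name has no leading "/"
tar -xpzvf /backup/system_backup.tgz -C / home/user/Documents
# replay the dump, but only apply the statements that belong to one database
mysql -u root -ppassword --one-database somedatabase < /backup/mysql_databases.sql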

As you will find, this is a pretty basic way to back up & restore the data on a given Linux system.  Your mileage may vary, as there are slight command line differences in how a given command works from one Linux version to another.  This write-up is based on my using it with different releases of CentOS & Fedora.  Use at your own risk.  This write-up is designed to help guide you in how to do things a certain way.  My way may not be the best solution for your particular situation or circumstance.

Please remember to work smart & think about security first.  You should 'chown' & 'chmod' all backed up data to limit its access to only its owner &/or encrypt the data if it contains sensitive information (such as credit cards, social security numbers, etc...).  Take the time to review the commands I have used above & fully understand what they do, before you attempt to use them; especially when you are going to restore data.  If you are unsure how things will play out, you should mock-up the system (such as in VirtualBox) & then try backing up / restoring the data there, before you mess with a mission critical system or with something which contains sensitive data.  Remember that if you take the time to do things right, it will help ensure you'll be OK when the inevitable happens.
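
For example, a minimal sketch of locking down & encrypting the backup files from the examples above ("gpg -c" does simple symmetric encryption & will prompt you for a passphrase):

# restrict the backup files so only their owner (root) can read them
chown root:root /backup/*
chmod 600 /backup/*
# optionally encrypt an archive before shipping it off-site (creates system_backup.tgz.gpg)
gpg -c /backup/system_backup.tgz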

Also remember that even though you are backing up your data, don't assume everything is OK.  You should periodically check the backup process to ensure the backups are being made properly.  Do a test restore every once in a while to ensure that your restoration process still works.  And keep more than one revision of your backup on hand.  It's useless to have backups if you cannot safely/properly restore the data or if one of your backups was somehow corrupted.  I like to build a mock setup in VirtualBox for restoration testing, so I'm not messing with a live system.  I also like to rotate my remote backup copies, so that I have at least 3 or 4 clean backups in case of data corruption.
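
One simple way to keep a few revisions on hand is to date-stamp the backup file names & prune the old copies; a rough sketch, assuming the same "/backup" folder as above & roughly a month of retention (adjust to suit your own rotation scheme):

# date-stamp the archive name, so each run keeps its own copy (e.g. system_backup_2012-01-30.tgz)
tar -pczv --exclude-from /root/backup_exclude.txt --file /backup/system_backup_$(date +%F).tgz /root /etc /home /var
# remove local backup copies older than 28 days
find /backup -name "system_backup_*.tgz" -mtime +28 -delete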

