So you think your web host makes backup copies of your website regularly? Think again. While most hosting companies back up their servers on a daily or weekly basis, you don’t have access to these backups: they are meant to restore an entire server in case of failure, and you would have to contact the host to restore your website from one of them.
Website and MySQL Backup Script
This simple backup script allows you to back up your website files as well as your MySQL databases. The backup file can also be encrypted with OpenSSL before being uploaded to a remote server.
Supported destination types:
- FTP
- SFTP
- Amazon S3
Requirements
While this script has only been tested on CentOS Linux 6.5 and 7.2, it will most likely work on other Linux distributions. If some Perl modules are missing, you will get an error message similar to this one:
Can't locate Net/SFTP.pm in @INC (@INC contains: /usr/local/lib64/perl5 /usr/local/share/perl5 /usr/lib64/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib64/perl5 /usr/share/perl5 .) at ./backup.pl line 225. BEGIN failed--compilation aborted at ./backup.pl line 225.
GMP Libraries
Before installing the required Perl modules, make sure that the GMP libraries are installed on your server.
On CentOS Linux, you can install gmp and gmp-devel using this command:
sudo yum install -y gmp gmp-devel
Perl Modules
NOTE: If you install Perl modules as root, they will be available to all the users on the server. However, Perl modules can also be installed by a regular user account. In this case, all the required files will be copied into this user’s directory. Installing the modules as root is the preferred way to go in my opinion.
Installing Perl modules can be tricky as CPAN does not always solve dependencies correctly. Here’s how you can fix this:
# cpan
cpan[1]> o conf prerequisites_policy follow
cpan[2]> o conf commit
cpan[3]> quit
Another option is to use App::cpanminus to install modules with dependencies. Use CPAN to install cpanminus:
# cpan
cpan[1]> install App::cpanminus
The following modules are required by the FTP and SFTP modules. It’s likely that they are installed by default on most servers.
cpanm Env
cpanm File::HomeDir
cpanm Net::SSH
cpanm Math::GMP
Now you can install the FTP and SFTP modules required by the backup script:
cpanm Net::FTP
cpanm Net::SFTP
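To make sure the modules load correctly, you can ask Perl to load each one; a silent exit means the module was found, while a missing module produces the "Can't locate" error shown earlier:

perl -MNet::FTP -e 1
perl -MNet::SFTP -e 1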
AWS CLI
If you wish to upload your backup to an AWS S3 bucket, you must install the AWS Command Line Interface. This tool allows you to control multiple AWS services from the command line.
Here’s how to install AWS CLI on a Linux server:
$ wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
$ unzip awscli-bundle.zip
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
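Once the installer finishes, you can confirm that the aws command is available on your path:

$ aws --version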
OpenSSL
To encrypt your backups, you must have OpenSSL installed. Here’s how to install it on CentOS Linux:
sudo yum install -y openssl
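For reference, the script encrypts the archive with the passphrase you will configure below. Based on the decryption command shown at the end of this article, the encryption step presumably looks something like this (a sketch only, with illustrative file names):

# assumed encryption counterpart of the decryption command shown later; file names are illustrative
$ openssl aes-256-cbc -a -in backup.tar.gz -out backup.tar.gz.enc -pass pass:my-secure-passphrase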
Backup Script Configuration
Before editing the configuration, you must create a directory into which the backup file will be placed:
$ mkdir /home/johndoe/backups
Next, rename the backup script from backup.txt to backup.pl and open it in your favorite (non-formatting) text editor.
While the script has a lot of parameters to configure, they are pretty much self-explanatory.
First, start by specifying a passphrase to encrypt your backup file:
# OPENSSL ENCRYPTION PASSWORD. LEAVE EMPTY FOR NO ENCRYPTION (NOT RECOMMENDED).
my $password = "my-secure-passphrase";
If you want the local backup file to be deleted once it has been uploaded to a remote server, set this parameter value to 1:
# DELETE LOCAL BACKUP AFTER UPLOAD (0 = no, 1 = yes)
my $delete_backup = 0;
Next, specify the directories you wish to backup:
# ENTER THE DIRECTORIES TO BACKUP, NO TRAILING SLASH. SEPARATE DIRECTORIES BY A BLANK SPACE.
my $directories = '/home/johndoe/public_html /home/johndoe/mail';
If you want to exclude files or sub-directories from the backup, create a text file named exclusions.txt listing all the files and folders you wish to exclude. Upload this file to your server, as it must be readable by the backup script.
Here’s an example:
/home/johndoe/public_html/cache
/home/johndoe/public_html/error_log
/home/johndoe/public_html/another-file.txt
/home/johndoe/public_html/another-folder
If you created an exclusion file, specify its full path in the backup script:
# ENTER THE FULL PATH TO THE FILE CONTAINING THE FOLDERS AND FILES TO EXCLUDE FROM THE BACKUP. LEAVE BLANK FOR NO EXCLUSIONS.
my $exclusion_file = '/home/johndoe/scripts/exclusions.txt';
Next, configure the MySQL parameters if you need to back up some databases:
# MYSQL BACKUP PARAMETERS
my $mysql_backup = 1; # 0 = Disable MySQL backup, 1 = Enable MySQL Backup
my $dbhost = 'localhost';
my $dbuser = 'johndoe_dbuser';
my $dbpwd = 'password';
my $backup_all_databases = 'no'; # IF SET TO NO, SPECIFY INDIVIDUAL DATABASE NAME(S) BELOW
If $backup_all_databases is set to ‘no’, specify the database name(s) to backup:
# ENTER DATABASE NAMES TO BACKUP SEPARATED BY SPACES (ONLY IF backup_all_databases IS SET TO 'no')
my $database_names = 'johndoe_db1 johndoe_db2 johndoe_db3';
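Before relying on the script, it can be worth checking that these credentials actually work with mysqldump. This quick manual test uses the example host, user, and database name from above; you will be prompted for the password:

$ mysqldump -h localhost -u johndoe_dbuser -p johndoe_db1 > /dev/null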
Define where the backup file will be placed. This is the directory you created at the beginning of this section:
# ENTER THE PATH TO THE LOCAL DIRECTORY YOU WISH TO SAVE THE BACKUP FILE TO, NO TRAILING SLASH
my $local_backup_dir = '/home/johndoe/backups';
If you wish to upload the backup to a remote server, start by specifying the destination type. Set this parameter to ‘none’ to disable the upload.
# SPECIFY BACKUP DESTINATION TYPE. POSSIBLE VALUES:
# 1. ftp: Transfer backup over FTP.
# 2. sftp: Transfer backup over SFTP.
# 3. aws: Transfer backup to Amazon Web Services S3 container.
# 4. none: Do not upload the backup to a remote server.
my $destination_type = 'aws';
If you specified ‘ftp’ or ‘sftp’ as the destination type, configure the following parameters:
# FTP/SFTP PARAMETERS
my $ftp_host = "ftp.your-remote-server.com";
my $ftp_port = 21;
my $sftp_port = 22;
my $ftp_user = "username";
my $ftp_pwd = "password";
my $ftp_dir = "/path/to/remote/destination/directory";
my $ftp_debug = 0; # Set to 0 to disable, 1 to enable
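If you picked sftp, it can help to confirm the connection details manually once before relying on the script. The host, port, and user below are the placeholder values from the example configuration above:

$ sftp -P 22 username@ftp.your-remote-server.com   # connectivity test with the placeholder values above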
On the other hand, if you wish to upload your backup to an Amazon S3 bucket, you must provide your AWS Access Key ID and your AWS Secret Access Key. If you need to create a new access key, refer to the AWS documentation on managing access keys.
# AMAZON WEB SERVICES S3 PARAMETERS
$ENV{"AWS_ACCESS_KEY_ID"} = 'AKIAIOSFODNN7EXAMPLE';
$ENV{"AWS_SECRET_ACCESS_KEY"} = 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY';
The S3 bucket and folder(s) must already exist as the backup script will not attempt to create them:
my $aws_bucket = 'my-s3-bucket-name';
my $aws_path = 'my-s3-bucket-folder';
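The AWS CLI you installed earlier can create the bucket and confirm access before the first run. The bucket and folder names below are the example values from the configuration above:

$ aws s3 mb s3://my-s3-bucket-name                       # create the bucket (skip if it already exists)
$ aws s3 ls s3://my-s3-bucket-name/my-s3-bucket-folder/  # list the folder to confirm access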
You can alter the names of the files that will be created by the backup script, but it is recommended to leave them at their default values:
# BACKUP FILE NAMES OPTIONS
my $homedir_backup_file = "homedir_backup-$year$month$day-$hh$mm$sec.tar.gz";
my $full_backup_file = "backup-$year$month$day-$hh$mm$sec.tar.gz";
my $mysql_backup_file = "mysql-$year$month$day-$hh$mm$sec.sql.gz";
If some of the system commands used by the script reside in another location, you can set their paths here:
# SYSTEM COMMANDS
my $cmd_tar = '/usr/bin/tar';
my $cmd_mysqldump = '/usr/bin/mysqldump';
my $cmd_gzip = '/usr/bin/gzip';
my $cmd_openssl = '/usr/bin/openssl';
my $cmd_aws = '/usr/bin/aws';
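If you are not sure where these binaries are located on your server, the which command will tell you:

$ which tar mysqldump gzip openssl aws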
Uploading the Backup Script and Setting Permissions
You can use any FTP/SFTP client to upload the backup script to your web server. If you don’t know where to put it, create a directory named scripts at the root of your account. Upload the backup script (backup.pl) and the exclusion file (if you created one) to this directory.
The content of your directory should now look like this:
/home/johndoe/scripts/backup.pl
/home/johndoe/scripts/exclusions.txt
Change the permissions on backup.pl to make it executable. You can achieve this by using the command line or your FTP client software.
Use this command if you have shell access:
$ chmod 0755 /home/johndoe/scripts/backup.pl
Otherwise, you can set the permissions to 0755 from your FTP client's file permissions dialog.
Executing the Backup Script
The backup script can be run from the command line or by using cron jobs. Using the command line, run the script like this to perform a backup manually:
$ /home/johndoe/scripts/backup.pl
If you want to back up your website automatically, you must use cron jobs. On a cPanel server, click the Cron Jobs icon in the Advanced panel.
Next, specify the frequency at which the backup should occur and enter the command to execute the script. In the example below, the backup runs daily at midnight.
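If you prefer to manage cron jobs from the shell instead of cPanel, the equivalent crontab entry (added with crontab -e) would look like this; redirecting the output simply prevents cron from emailing you the script output after every run:

0 0 * * * /home/johndoe/scripts/backup.pl > /dev/null 2>&1   # run the backup daily at midnight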
Decrypting a Backup File
If the backup file is encrypted, you must use the openssl command to decrypt it before you can extract its content:
$ openssl aes-256-cbc -d -a -in backup-20160401-144138.tar.gz.enc -out backup-20160401-144138.tar.gz -pass pass:my-secure-passphrase
You will now be able to extract the content from the archive:
$ tar xvzf backup-20160401-144138.tar.gz
If you find bugs or run into issues with the script, leave a comment below!