If you haven’t gotten around to setting up regular backups of your website’s MySQL databases, you are asking for serious trouble. In this article I am going to provide shell script examples that you can use to quickly set up database backup jobs. As far as scope goes, we are going to be talking about how to perform a backup to a local directory, compress that backup file, and then port it off to a remote location if you so desire. The following method has worked really well for me and I hope it does for you as well.

I run about 9 different websites off of a single dedicated server that runs a full LAMP stack (Ubuntu, Apache Web Server, MySQL Database Server, and PHP). I have full root access to my server. You might not have that if you are on a shared hosting provider; however, many shared hosting providers account for this and have methods for setting up cron jobs. You will need to adapt this to your environment.

Create a New MySQL User Account for Taking Backups
First things first, to keep things simple you can use phpMyAdmin or the MySQL command line to set up a new database user called “backups” that has only the privileges necessary to back up all of your databases. You could also use your root MySQL account, but I would consider that a major security risk. To quickly set up a new user in MySQL, you can use the following commands from the shell:

Log in to MySQL with the following (supply your root MySQL password at the prompt):

shell> mysql --user=root -p

Your prompt should change to show “mysql>” in front, indicating you are running commands in your MySQL instance.

Set up a new user (supply a password in place of some_pass, keep the quotation marks as-is):

CREATE USER 'backups'@'localhost' IDENTIFIED BY 'some_pass';

Okay, you now have a user called “backups” and you should know the password (write it down somewhere temporarily) for said user. Next, grant this user the SELECT, TRIGGER, SHOW VIEW, & LOCK TABLES privileges across all databases, which should allow backups to be taken in just about any MySQL scenario:

GRANT SELECT, TRIGGER, SHOW VIEW, LOCK TABLES ON *.* TO 'backups'@'localhost';


Install Zip/Unzip on Linux
Next… I like to use “zip” to compress and archive my database files, as the zip format is a bit more portable across operating systems (Windows supports it natively). Zip isn’t installed by default on most Linux distributions. Here is what you need to do to get it installed on Ubuntu. From the command line:

sudo apt-get install zip unzip

Setup Directories for Storing and Manipulating Backups
The shell script we are going to use to take backups needs some directories set up to store our data. The below commands assume you want to use the same hierarchy that I do. Wherever you see “username” you need to change this to your username on the server, as we are working out of a home directory.

mkdir /home/username/backups
mkdir /home/username/backups/database
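As a small shortcut, the same hierarchy can be built in a single command with mkdir’s -p flag:

```shell
# -p creates any missing parent directories, so one command
# builds both /home/username/backups and .../backups/database.
mkdir -p /home/username/backups/database
```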

Setup Your Shell Script for Taking the Backups

In this section, I am going to just provide my script and then discuss it. So here it is:


#!/bin/bash

DATE=`date +%Y%m%d`

mysqldump -h localhost --opt -u backups --password=some_pass website1dbname > /home/username/backups/database/website1dbname-backup-$DATE.sql
zip /home/username/backups/database/website1dbname-backup-$DATE.zip /home/username/backups/database/website1dbname-backup-$DATE.sql
rm /home/username/backups/database/website1dbname-backup-$DATE.sql
  • “#!/bin/bash” – declares this is a bash script.
  • “DATE=`date +%Y%m%d`” – This declares a variable called “DATE” that we will reference below. It holds the current date in YYYYMMDD format (four-digit year, two-digit month, two-digit day, no separators).
  • “mysqldump -h localhost --opt -u backups --password=some_pass website1dbname > /home/username/backups/database/website1dbname-backup-$DATE.sql” – This runs mysqldump against the local server with the username (backups) and the password (some_pass… which you need to change to your password); finally, “website1dbname” is the name of the database in MySQL. The “>” redirects the dump into a local file called “website1dbname-backup-$DATE.sql”, and $DATE dynamically fills in whatever the day’s date is when this is run, so we can keep our files organized.
  • “zip /home/username/backups/database/website1dbname-backup-$DATE.zip /home/username/backups/database/website1dbname-backup-$DATE.sql” – This line runs the zip program we installed earlier and compresses our SQL file into a zip archive, which saves a LOT of space as SQL files compress down quite a bit.
  • “rm /home/username/backups/database/website1dbname-backup-$DATE.sql” – Finally, this last bit of code deletes the SQL file, as we don’t need both the compressed and uncompressed copies.
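To see what the $DATE substitution actually produces, you can run a quick sketch on its own, using the same backtick syntax as the script:

```shell
# Standalone demonstration of the DATE variable used in the script.
# date +%Y%m%d prints the date as an 8-digit stamp (e.g. year 2024,
# month 01, day 31 becomes 20240131), which sorts chronologically.
DATE=`date +%Y%m%d`
echo "website1dbname-backup-$DATE.sql"
```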

Multiple Database Backups and Cleaning up old Backups
Okay, this script, as is, will successfully make a local backup of the database for one website. Now I am going to show a slightly expanded version of the script that backs up a second database (by adding a new section) and also does house-cleaning. Space is limited, so I only keep 90 days of daily database backups on hand. Here we go:


#!/bin/bash

DATE=`date +%Y%m%d`

mysqldump -h localhost --opt -u backups --password=some_pass website1dbname > /home/username/backups/database/website1dbname-backup-$DATE.sql
zip /home/username/backups/database/website1dbname-backup-$DATE.zip /home/username/backups/database/website1dbname-backup-$DATE.sql
rm /home/username/backups/database/website1dbname-backup-$DATE.sql

mysqldump -h localhost --opt -u backups --password=some_pass website2dbname > /home/username/backups/database/website2dbname-backup-$DATE.sql
zip /home/username/backups/database/website2dbname-backup-$DATE.zip /home/username/backups/database/website2dbname-backup-$DATE.sql
rm /home/username/backups/database/website2dbname-backup-$DATE.sql

find /home/username/backups/database -type f -mtime +90 -delete

The new section of the script is pretty self-explanatory. Because our backup user has privileges across all databases, you can use the exact same lines of code; just change the database name everywhere it appears.

The final line shown above, “find /home/username/backups/database -type f -mtime +90 -delete”, does our housekeeping for us. It uses the “find” command to locate all of the files in the database backup directory that were last modified more than 90 days ago and then deletes them. This will keep only 90 days of backups on hand. You can adjust the number up or down based on your retention requirements.
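Before trusting the deletion step, it is worth doing a dry run: the same find expression without -delete only prints the files that would be removed, so you can sanity-check the list first.

```shell
# Dry run of the housekeeping step: list (but do not delete) backup
# files last modified more than 90 days ago.
find /home/username/backups/database -type f -mtime +90
```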

At this point, everything we need to take local backups of our databases is in place. However, if you are extra security conscious and are lucky enough to have a second Linux server available, you might want to also port copies of your database backups to that server for safekeeping in the event your primary server blows up… For this task we can use rsync, and I will write a follow-up article on how to set that up in the near future. But for now… cron.

Build Your Script & Set it to Run as a Daily Cron Job
Okay, last but not least… we need to actually create our script and get cron to run it for us. So here we go…

This section assumes you have full access to your server (so for most shared hosting this section might not apply all that well). If you are on a server where you don’t have full control/access you need to talk to your hosting provider and ask them what their recommendation is for getting a cron-job to run a bash script for you.

First, copy and paste all of this into a blank text document and then edit it as you need based on everything we discussed above… Specifically, change “some_pass”, “website1dbname”, and “username” wherever they appear to match your environment. Copy, paste, and modify additional mysqldump sections as needed. Also, change the +90 to however many days you want to keep backups for.


#!/bin/bash

DATE=`date +%Y%m%d`

mysqldump -h localhost --opt -u backups --password=some_pass website1dbname > /home/username/backups/database/website1dbname-backup-$DATE.sql
zip /home/username/backups/database/website1dbname-backup-$DATE.zip /home/username/backups/database/website1dbname-backup-$DATE.sql
rm /home/username/backups/database/website1dbname-backup-$DATE.sql

mysqldump -h localhost --opt -u backups --password=some_pass website2dbname > /home/username/backups/database/website2dbname-backup-$DATE.sql
zip /home/username/backups/database/website2dbname-backup-$DATE.zip /home/username/backups/database/website2dbname-backup-$DATE.sql
rm /home/username/backups/database/website2dbname-backup-$DATE.sql

find /home/username/backups/database -type f -mtime +90 -delete

Once everything is the way you want it, open up an SSH session to your server, login, and then give yourself a root shell like this:

sudo -s

Now we are going to create a blank text document and copy and paste our script into it. The document is going to be created in the cron.daily directory, which means our script will be run once a day once we are finished.

vim /etc/cron.daily/mysql-backup

That opens up a new vim session. Copy and paste your code in (“i” puts you in insert mode, then paste your code, then hit the “esc” key) and then save and close (type “:wq!” and hit the enter key).

Make your file executable:

chmod +x /etc/cron.daily/mysql-backup
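One Debian/Ubuntu gotcha worth knowing: the scripts in /etc/cron.daily are executed by run-parts, which silently skips filenames containing dots, so “mysql-backup” is fine but a name like “mysql-backup.sh” would never run. You can ask run-parts to show what it would actually execute:

```shell
# List the scripts run-parts would execute for the daily cron run.
# Your new "mysql-backup" file should appear in this output; a name
# containing a dot (e.g. "mysql-backup.sh") would be skipped.
run-parts --test /etc/cron.daily
```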

If you want to test your script file, you can execute it manually with the following:

sudo /etc/cron.daily/mysql-backup

You should now have a script that runs daily and dumps each of your MySQL databases to a separate archive file on your local disk. In the near future I will explain how to set up rsync, and then we will append one final line of code to our script to sync our database backups to a remote location.



