Daniel Dvorkin

My take on WordPress and related geekery


Bash script to backup all your MySQL databases

Backing up all your MySQL databases one by one is a pain. Here is a small bash script I made to dump and compress all my databases to my Dropbox folder.

  • The script will skip any database whose name starts with an underscore, so I can have test or junk databases that I don’t want to back up.
  • Every time it runs, it’ll delete all the previous backups (I don’t care, but if you do then just comment out line 7)
  • You need to change lines 3, 4 and 5 to reflect your MySQL user, password and folder where you want to put the dumps.


#!/bin/bash

USER="your_mysql_user"
PASSWORD="your_mysql_password"
OUTPUT="/path/to/your/Dropbox/backups"

rm "$OUTPUT"/*gz > /dev/null 2>&1

# List every database, stripping mysql's table formatting and the header row
databases=`mysql --user=$USER --password=$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != _* ]]; then
        echo "Dumping database: $db"
        mysqldump --force --opt --user=$USER --password=$PASSWORD --databases $db > "$OUTPUT/`date +%Y%m%d`.$db.sql"
        gzip "$OUTPUT/`date +%Y%m%d`.$db.sql"
    fi
done
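In case the `tr -d "| " | grep -v Database` pipeline looks cryptic: `tr` deletes any pipe and space characters left over from mysql's table formatting, and `grep -v` drops the "Database" header row. Here is a quick illustration with made-up database names (when mysql's output is piped it is already a plain column, so the `tr` is mostly a safety net):

```shell
#!/bin/bash
# Hypothetical sample of what `mysql -e "SHOW DATABASES;"` prints when piped:
# a "Database" header row, then one database name per line.
sample='Database
information_schema
mydb
wp_blog'

# Strip formatting characters, then drop the header row
echo "$sample" | tr -d "| " | grep -v Database
```

This leaves just the database names, one per line, ready for the `for` loop.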

Now you just need to make it executable:

chmod 700 backup.sh

And then add it to the crontab so it’ll run automagically:

crontab -e
00 20 * * * /path/to/backup.sh

In this case it’ll run every day at 8 PM. You can learn more about crontab here.
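For reference, the five fields before the command are minute, hour, day of month, month, and day of week:

```
# minute hour day-of-month month day-of-week command
00 20 * * * /path/to/backup.sh

# For example, to run only on Sundays at 3:30 AM instead:
# 30 3 * * 0 /path/to/backup.sh
```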



  1. Sam

    Nice job man! Just what I needed. Thanks for sharing.

  2. Brimiru

    Although it would require a few more additions to manage deleting the proper files, you might want to move the rm *.gz part to the end of the script so as to ensure that you don’t delete your last good backup before a potential failure of the new backup. 😉

  3. Excellent code.
    Works very well and is easy to understand.
    I appreciate that you shared it.
    Thanks a lot!

  4. Warface

    Shouldn’t the $OUTPUTDIR on line 7 be $OUTPUT, since OUTPUTDIR isn’t set anywhere?

  5. Matthew Clark

    Line 7 incorrectly references $OUTPUTDIR (should be $OUTPUT).

    Also, potential users should note that if you do decide to remove line 7 as mentioned by the author (to keep compressed files), and run the script more than once in the same day, you’ll get errors when gzip tries to compress the SQL files (duplicate .gz files will exist).

  6. Marco

    You probably mean:
    rm "$OUTPUT/*gz" > /dev/null 2>&1
    instead of:
    rm "$OUTPUTDIR/*gz" > /dev/null 2>&1
    Thanks a lot for the excellent work!!

  7. matias

    Saving this one. Very simple, and a big help when migrating servers. Thanks for sharing it.

  8. Jeevanand

    Working fine and very easy to understand, Nice Code Dude.. :-)

  9. James

    There’s a bug in this script that’ll cause it to remove all your *gz files from root:


    rm "$OUTPUTDIR/*gz" > /dev/null 2>&1


  10. Sapta Aji Yudistira

    Thank you so much, to share this code, it’s really easy to understand, once again, thank you so much.

  11. r0n

    You could also use mysql switches that are quite handy when gathering data from a script. I usually prefer

    result=$(mysql -u SOMEUSER -pXXX -BNse "SELECT some_fields from table where foobar")

    where -B is for "batch mode", -N suppresses the column names (first row), -s makes it silent, so less verbose output (e.g. no ASCII-table formatting), and -e executes the query.

    There are also situations where you need to set your $IFS to newline only, so that you can work with the result variable we used above (it contains all the lines found by the SELECT query). Do not forget to unset $IFS afterwards!

    IFS=$'\n'
    # ...work with $result line by line...
    unset IFS

    Furthermore, you should always check the $? variable right after your important commands to see if they actually succeeded.

    databases=`mysql --user=$USER --password=$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`
    if [ $? -eq 0 ]; then
        # go on with code
    else
        # exit and report because the command failed
        exit 1
    fi
    Have fun!

  12. Jeremy

    This worked perfectly for me on the first attempt. Hooray! Thanks for sharing this code. Awesome.

  13. alberto

    Maybe you can extend the script to send your database by email, for example to Yahoo, which offers 1 TB free:

    mysqldump -e --user=username --password=password dbname | gzip | uuencode dbbackup_e.gz | mail email@address.com

  14. Your script is a little wrong:

    rm "$OUTPUTDIR/*gz" > /dev/null 2>&1
    it should be
    rm "$OUTPUT/*gz" > /dev/null 2>&1

  15. T

    working as a charm. Thanks!

  16. Kris

    Thanks for the script. Could you explain what the following part on line 10 does?
    | tr -d "| " | grep -v Database

  17. William

    Great script, thank you!

    Found something though:
    Line 5: "OUTPUT"
    Line 7: "$OUTPUTDIR"

  18. Thanks for all the comments about $OUTPUTDIR. Fixed it!

  19. Herz

    Very useful, thank you.

    Here is my version, which removes backups older than 48 hours.

    # If the password contains characters such as #@ you might need to change
    # --password=$PASSWORD to --password='your_password_here',
    # otherwise mysql won't parse the password correctly.

    # Delete dumps modified more than 48 hours ago
    find /home/user/db_back -name "*.gz" -type f -mtime +1 -delete

    databases=`mysql --user=$USER --password=$PASSWORD -e 'SHOW DATABASES;' | tr -d '| ' | grep -v Database`

    for db in $databases; do
        if [[ $db != 'information_schema' ]] && [[ $db != _* ]] && [[ $db != 'performance_schema' ]] && [[ $db != 'mysql' ]]; then
            echo "Dumping database: $db"
            mysqldump --force --opt --user=$USER --password=$PASSWORD --databases $db | gzip > "$OUTPUT/$db-$(date +%m-%d-%Y-%r).sql.gz"
        fi
    done

  20. Dany Aguilar

    Extremely useful! nice article! thanks!

  21. Ramprasad

    Thanks!!! My whole day's work is now simple and fast.

  22. developerbmw

    The use of tr to remove the table formatting characters is unnecessary since MySQL outputs raw data if it’s not going to a TTY. Works on MySQL 5.5.46.

  23. Romeu Gamelas

    Just changed this:

    # So you can exclude databases you don't want to back up
    EXCLUDE="mysql information_schema performance_schema phpmyadmin"
    databases=`mysql --user=$USER --password=$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

    for EX in $EXCLUDE; do
        databases=`echo $databases | sed "s/\b$EX\b//g"`
    done

    for db in $databases; do
    if [[ "$db" != _* ]] ; then

  24. You can also have a look at this script, which takes care of password security, automated encryption and compression, and integrates well with Nagios.


  25. jamespsterling

    Very helpful for dumping each individual db, thanks!

  26. Alex K

    I am using this on CentOS 7 x64 with MySQL 5.6.30, but it is not working; I'm getting the following error:

    [root@fr ~]# cd /backup/
    [root@fr backup]# sh backup.sh
    backup.sh: line 2: $'\r': command not found
    backup.sh: line 6: $'\r': command not found
    : ambiguous redirect
    backup.sh: line 8: $'\r': command not found
    Warning: Using a password on the command line interface can be insecure.
    '@'localhost' (using password: YES)or user 'root
    backup.sh: line 10: $'\r': command not found
    backup.sh: line 11: syntax error near unexpected token `$'do\r''
    'ackup.sh: line 11: `for db in $databases; do

    • Romeu Gamelas

      You probably edited the file with an editor that saved Windows-style line endings (\r means "carriage return"; Windows ends lines with \r\n, while Unix systems use a bare line feed, so bash chokes on the stray \r characters).
      So just re-edit the file and remove all the \r characters, i.e. save it with Unix line endings.

  27. Hannes Lober

    You can leave out the `| tr -d "| " | grep -v Database` part if you use mysql with --batch --skip-column-names.

  28. Martijn

    Basic, easy to understand script! Thank you!
    Only one issue with this:
    you create the dump, write it to yearmonthday.dbname.sql, and then gzip compresses it. The problem is that if you run this script for several databases near the end of the day (for example, due to heavy load on the servers at midday) and a dump only finishes when it's already the next day, gzip won't be able to find the file mysqldump just created, because the date stamp has changed.

    Save the date in a variable.
    You can also include the hour and minutes when you create multiple backups per day:

    DATE=`date +%Y-%m-%d_%H:%M`
    mysqldump --force --opt --user=$USER --password=$PASSWORD --databases $db > $OUTPUT/$DATE.$db.sql
    gzip $OUTPUT/$DATE.$db.sql

  29. Sanjay Pateliya

    Thank you. This script looks simple to understand, but I am getting this error while executing it:

    "Warning: Using a password on the command line interface can be insecure.
    ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)"

  30. Hi, I need a script which backs up a MySQL database table by table. The script should record the start and end time of the backup, and write a log file to a directory other than /var/log. The start and end times, whether the backup succeeded or failed, any issues, and the total backup time should be sent by email. Please help me with this, guys.
