
Hello friends, I am back with a new problem: how to take automatic database dumps and delete old ones.

Now here is an approach to solve this problem. Here we go:

### Make a .pgpass file in the home directory

Make a file named .pgpass and write the following code inside that file:

hostname:port:database:username:password
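
For example, a hypothetical entry for a local database (every value below is a placeholder; substitute your own) looks like this:

localhost:5432:your_database:your_username:your_password

Each of the first four fields may also be *, which matches anything.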

### Change permission

Restrict the file so that only your user can read it; PostgreSQL ignores a .pgpass file with looser permissions:

chmod 600 .pgpass
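
You can verify the result afterwards; the permissions column should show that only the owner has access:

ls -l ~/.pgpass
# expected output begins with: -rw-------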

### Make a folder for backups

Make a directory named DB_BACKUPS inside the home directory.
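
Assuming the same home directory layout the script below uses, a single command does it:

mkdir -p ~/DB_BACKUPS
# -p makes the command safe to re-run; it does nothing if the directory already exists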

### Create a script for dump

Create a file named daily_backup in the home directory and write the following code inside that file:

#!/bin/bash

# directory where backups are stored
DIR=~/DB_BACKUPS
FILE_NAME=$(date "+%Y-%m-%d")

# dump the database in PostgreSQL's compressed custom format
pg_dump -U user_name -Fc database_name > $DIR/database_backup.$FILE_NAME.dump

# delete backup files older than 10 days
OLD=$(find $DIR -type f -mtime +10)
if [ -n "$OLD" ] ; then
    echo deleting old backup files: $OLD
    echo $OLD | xargs rm -fv
fi

The above script takes a dump each time it runs and deletes all backup files that are older than 10 days.
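
Because -Fc writes PostgreSQL's custom archive format, the backup is restored with pg_restore rather than psql. A minimal sketch, assuming the placeholder user and database names above and a backup taken on January 1st:

pg_restore -U user_name -d database_name ~/DB_BACKUPS/database_backup.2024-01-01.dump

The target database must already exist; alternatively, pass -C to have pg_restore create it first.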

### Adding a crontab entry

For the automated dump, add a crontab entry for the script. Run crontab -e and write the following in the crontab:

1 12 * * * /bin/bash ~/daily_backup
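
This entry runs the script daily at 12:01 PM; the five fields are minute, hour, day of month, month, and day of week. Before relying on cron, it is worth running the script once by hand and checking that a fresh dump appears:

bash ~/daily_backup
ls ~/DB_BACKUPS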