Archive for the ‘Backup’ Category

Backup All MSSQL Databases

July 26, 2011

The following SQL script backs up all MSSQL databases to “D:\database\”; each backup file will have a “.BAK” extension. Save this file as “D:\database\Backup.sql”.

    DECLARE @name VARCHAR(50) -- database name
    DECLARE @path VARCHAR(256) -- path for backup files
    DECLARE @fileName VARCHAR(256) -- filename for backup
    DECLARE @fileDate VARCHAR(20) -- used for file name

    SET @path = 'D:\database\'

    SELECT @fileDate = CONVERT(VARCHAR(20),GETDATE(),112)

    DECLARE db_cursor CURSOR FOR
    SELECT name
    FROM master.dbo.sysdatabases
    WHERE name NOT IN ('master','model','msdb','tempdb')

    OPEN db_cursor 
    FETCH NEXT FROM db_cursor INTO @name 

    WHILE @@FETCH_STATUS = 0 
    BEGIN 
           SET @fileName = @path + @name + '_' + @fileDate + '.BAK'
           BACKUP DATABASE @name TO DISK = @fileName

           FETCH NEXT FROM db_cursor INTO @name 
    END 

    CLOSE db_cursor 
    DEALLOCATE db_cursor
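For reference, the script names each file as @name + '_' + @fileDate + '.BAK', where CONVERT style 112 renders the date as YYYYMMDD. A small shell sketch of the resulting name (the database name and date here are made up):

```shell
# Illustrative only: mimic the file-name pattern the T-SQL above builds
name="MyDatabase"                          # hypothetical database name
fileDate=$(date -d "2011-07-26" +%Y%m%d)   # CONVERT(..., 112) => YYYYMMDD
echo "D:\\database\\${name}_${fileDate}.BAK"
```

So a database called MyDatabase backed up on July 26, 2011 would end up as D:\database\MyDatabase_20110726.BAK.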

Note down the system name (from msinfo32). In my case the system name was “WINDOWS”; replace this with your own system name 😉 . Save the following code as a batch file named “D:\database\ftpbackup.bat”. The script runs the SQL backup above and then uploads all of the resulting backup files to a directory named “backup/windows1_bd” on the remote FTP server.

    sqlcmd -S WINDOWS -E -i D:\database\Backup.sql  > backup.log
    dir /a /b /-p /o:gen *.bak > file.txt
    echo open 67.20.99.43 > ftp.bat
    echo servhost  >> ftp.bat
    echo password >> ftp.bat
    echo cd backup/windows1_bd >> ftp.bat
    echo bin  >> ftp.bat
    echo hash >> ftp.bat
    for /F "eol=; delims=," %%i in (file.txt) do (
    @echo put %%i >> ftp.bat
    )
    echo bye >> ftp.bat
    C:\Windows\System32\ftp -s:ftp.bat >> backup.log
    del file.txt ftp.bat
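For clarity, here is roughly what the generated ftp.bat would contain for a run that found two backup files (the .BAK names below are made up; “servhost” and “password” stand in for your real FTP credentials):

```
open 67.20.99.43
servhost
password
cd backup/windows1_bd
bin
hash
put db1_20110726.BAK
put db2_20110726.BAK
bye
```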
Categories: Backup

Linux Cpanel Backup to Amazon S3

March 22, 2011

In this article I will explain how to take a cPanel backup to Amazon S3 (with backup rotation enabled). The step-by-step procedure is explained below.

Step1) Activate an account in S3. You will get an access key and a secret key after the activation.

You can create a new s3 account by following the url,

Step2) Install the S3 client for Linux. The package name is “s3cmd” (version 1.0.0-4.1 at the time of writing).

root@heuristics:~# apt-get install s3cmd

On RedHat or CentOS based machines (using RPM packages), you can install “s3cmd” as follows:

cd /etc/yum.repos.d
wget http://s3tools.org/repo/CentOS_5/s3tools.repo
yum install s3cmd

Alternatively, you can download it from the url pasted below:

Step3) Configure s3 client using the command,

root@heuristics:~# s3cmd --configure

It will ask for the access key and secret key that we received during account activation; the step fails if the wrong key values are provided. Once this step is completed, the configuration is stored in the file “/root/.s3cfg”.

During configuration you will be asked whether to enable encryption. Enabling encryption improves the security of the transfer, but makes uploads slightly slower.
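For reference, the relevant part of the generated “/root/.s3cfg” looks roughly like the excerpt below (values are placeholders; the real file contains many more options):

```
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
use_https = False
encrypt = False
```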

Step4) We need to create buckets in s3 for storing the backup.

eg: creating a bucket named “Backup_daily”,

root@heuristics:~# s3cmd mb s3://Backup_daily

For additional options, refer to the URL,

Step5) Enable daily backup from WHM. Refer to the URL pasted below for reference,

If backup is already configured, then we can find the backup location using the command,

root@heuristics:~#grep BACKUPDIR /etc/cpbackup.conf
BACKUPDIR /backup
root@heuristics:~#

Inside “/backup” there will be another directory named “cpbackup”, which holds the daily, weekly and monthly backups. In my case,

root@heuristics:~# ls /backup/cpbackup/
./  ../  daily/  monthly/  weekly/
root@heuristics:~#

Step6) Create log directories,

root@heuristics:~# mkdir /var/log/backuplogs
root@heuristics:~#

Step7) Write a script to automate the backup and save it as “/root/dailybackup.sh”. In the script pasted below, the backup rotation degree is set to 3 (“DEGREE=3”, line 16). This means that backups older than 3 days will be deleted automatically. You can increase this retention period by adjusting the “DEGREE” variable on line 16.

#!/bin/bash

##Notification email address
_EMAIL=your_email@domain.com

ERRORLOG=/var/log/backuplogs/backup.err`date +%F`
ACTIVITYLOG=/var/log/backuplogs/activity.log`date +%F`

##Directory which needs to be backed up
SOURCE=/backup/cpbackup/daily

##Name of the backup in bucket
DESTINATION=`date +%F`

##Backup degree
DEGREE=3

#Clear the logs if the script is executed second time
:> ${ERRORLOG}
:> ${ACTIVITYLOG}

##Uploading the daily backup to Amazon s3
/usr/bin/s3cmd -r put ${SOURCE} s3://Backup_daily/${DESTINATION}/ 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
ret2=$?

##Send email alert
msg="BACKUP NOTIFICATION ALERT FROM `hostname`"

if [ $ret2 -eq 0 ];then
msg1="Amazon s3 Backup Uploaded Successfully"
else
msg1="Amazon s3 Backup Failed!!\n Check ${ERRORLOG} for more details"
fi
echo -e "$msg1"|mail -s "$msg" ${_EMAIL}

#######################
##Delete backups older than DEGREE days
## Delete from both server and amazon
#######################
DELETENAME=$(date  --date="${DEGREE} days ago" +%F)

/usr/bin/s3cmd -r --force del s3://Backup_daily/${DELETENAME} 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
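The rotation at the end of the script leans on GNU date's relative-date syntax: subtracting DEGREE days from today reproduces the bucket prefix that was uploaded DEGREE days ago, which is exactly what needs deleting. A sketch with the date pinned so the result is deterministic:

```shell
# Sketch of the DELETENAME calculation, pinned to an example date
DEGREE=3
TODAY="2011-03-22"    # the real script uses the current date
DELETENAME=$(date --date="${TODAY} -${DEGREE} days" +%F)
echo "${DELETENAME}"  # 2011-03-19
```

With DEGREE=3, each nightly run uploads today's backup and removes the one from three days earlier, so at most three dated prefixes exist in the bucket at any time.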

Step8) Grant execute privilege to the script and schedule it to run every day,

root@heuristics:~# chmod u+x /root/dailybackup.sh
root@heuristics:~# cp -p /root/dailybackup.sh /etc/cron.daily/
root@heuristics:~#
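One caveat: on some distributions, the run-parts implementation behind /etc/cron.daily/ skips files whose names contain a dot, so if the script never fires, either drop the “.sh” extension or fall back to an explicit root crontab entry (the time below is arbitrary):

```
# crontab -e (as root): run the S3 backup at 02:30 every day
30 2 * * * /root/dailybackup.sh
```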

NOTE:

Or, if you wish to start the Amazon S3 backup script right after the cPanel backup process, create a cPanel post-backup hook named “/scripts/postcpbackup” with the following contents,

#!/usr/bin/perl
system("/root/dailybackup.sh");

The post backup hook will start the amazon s3 backup script right after every cpanel backup completion.

In case of disaster, we can download the backup from the bucket using the same s3cmd tool:

root@heuristics:~# mkdir restore
root@heuristics:~# s3cmd -r get s3://Backup_daily/2011-03-22  restore
Categories: Amazon s3, Backup, Cpanel/WHM