
Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2010/01/15 23:50:26
by hansolo
Hi,

First, I apologize if I posted this to the wrong forum; moderators, please move it somewhere more appropriate if needed.

After spending some time discovering The BIG BANG of the Universe and The Meaning of Life :lol: :-)
I somehow managed to create a script that makes a [b]backup of files[/b] on the server and TARs it,
then [b]FTPs the archive to another location[/b] (an FTP server), and finally [b]e-mails the result[/b].
It also measures the time needed to complete, deletes archives older than XX days (set via find -mtime +20),
and makes an [b]incremental[/b] backup every weekday and a [b]FULL[/b] one [b]on Sundays[/b] (which suits me because there is no heavy load then).
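The pruning step (deleting archives older than XX days) can be sketched in isolation like this; the file names and the temporary directory here are made up for the demo:

```shell
#!/bin/sh
# Demo of age-based pruning with find -mtime: archives older than 20 days
# (by modification time) matching BACKUP*.tgz are deleted, newer ones survive.
# Runs in a throwaway temporary directory; all names are hypothetical.
set -e
demo=$(mktemp -d)
cd "$demo"
touch BACKUP-new.tgz                    # fresh archive: must survive
touch -d '30 days ago' BACKUP-old.tgz   # stale archive: must be pruned
find . -name 'BACKUP*.tgz' -mtime +20 -delete
ls                                      # only BACKUP-new.tgz remains
```

Note that -mtime +20 matches files modified more than 20 whole 24-hour periods ago.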

The files for tar to include and exclude are listed in text files, one name per line:
file: [i]including.txt[/i]:
[code]
/var/
/etc/
/home/
[/code]

file: [i]excluding.txt[/i]:
[code]
proc
*_log*
var/tmp
var/lib/bluetooth
var/lib/cs
var/lib/dav
var/lib/dbus
var/lib/dhcpv6
var/lib/dovecot
var/lib/games
var/lib/rpm
var/lib/webalizer
var/lib/yum
var/tmp/arhiv_serverja
var/log
var/run
var/www/manual
var/yp
var/lib/php/session
spool
cache
*zip
*gz
etc/rc*
home/httpd/manual
rpm
[/code]


I used [b]lftp[/b] to make sure the FTP transfer runs to completion, since plain ftp didn't always finish. The script for transferring the BACKUP file:
[b]lftpscript.sh[/b]
[code]
#!/bin/sh
# Connects to $HOST with the given username/password, changes to the
# storage directory, and uploads the file whose name is stored in
# archivename.txt (e.g. BACKUP-2009-12-15-Fri-01-15-01h.tgz).
HOST='ftp.domain.com'
USER='ftpuser'
PASSWD='idontknow'
FILE=$(cat archivename.txt)

# all in one command; lftp chains the sub-commands with &&
lftp -c "open $HOST && user $USER $PASSWD && cd FOLDER_NAME_FOR_STORING/backups/ && put $FILE"
exit 0
[/code]

I don't know if this is enough to keep your server safe in case of a crash, but that is why I'd like to share it, so we can make it better...

Please make suggestions to get the script near 100% safe in case of a server crash (I really need that, because I have
to maintain 2 servers with no RAID, just 1 HDD each, running WWW and e-mail), and corrections if something is not explained well.

This is the script I wrote for my own use (you will have to modify it for yourself; I hope you can find what and where).
Since it runs every day from CRON, I made a directory [ /usr/tmp/server_backup ] with root-only access and put the script in it:

The crontab entries that run the script:

PATH=/sbin:/bin:/usr/bin:/usr/sbin:/usr/local/bin

0 1 * * * archive.sh >/dev/null # at 1:00 AM every day

The [b]archive.sh[/b] wrapper is in /usr/local/bin:
[code]
#!/bin/bash
cd /usr/tmp/server_backup
./backup.sh # the real script to make backup
[/code]
This only changes into /usr/tmp/server_backup and runs the 'real' script.

[b]backup.sh[/b]:
[code]
#!/bin/sh

# DELETE archives older than N days (set via -mtime +N)
find . -name 'BACKUP*.tgz' -mtime +20 -delete
find . -name 'stopwatch*' -mtime +2 -delete

start1=$(date +"%T h ( %s )")
start=$(date +%s)

# on SUNDAY make a FULL backup
if [ "$(date +%a)" = "Sun" ]; then   # %a is locale-dependent; this assumes an English locale
    SNAPSHOTFILE="./usr-full"   # snapshot file needed by tar (GNU tar, to be precise) to decide what goes into incrementals
    ARCHIVENAME=BACKUP-full-$(date +"%F-%a-%H-%M-%Sh").tgz   # self-explanatory
    rm -f "./usr-full"          # start from a fresh snapshot so the full backup is really full
else
    ARCHIVENAME=BACKUP-$(date +"%F-%a-%H-%M-%Sh").tgz
    # the name of a backup file: BACKUP-2009-12-15-Fri-01-15-01h.tgz
    SNAPSHOTFILE="./usr-1"
    cp "./usr-full" "./usr-1"   # work on a copy so usr-full keeps the state of the last full backup
fi

echo "$ARCHIVENAME" >archivename.txt # store the archive name for the FTP transfer in archivename.txt, whose name never changes (read later by lftpscript.sh)!
# creating the text to send in the e-mail
echo "-----------------------" >stopwatch-$ARCHIVENAME.txt
echo "Backup of $ARCHIVENAME" >>stopwatch-$ARCHIVENAME.txt
echo "-----------------------" >>stopwatch-$ARCHIVENAME.txt
echo " " >>stopwatch-$ARCHIVENAME.txt # writes a blank line
# I do not need this precise time: { time tar -T including.txt -X excluding.txt -pczvRf $ARCHIVENAME; } 2>> stopwatch-$ARCHIVENAME.txt >/dev/null
{ tar -T including.txt -X excluding.txt -pczvR --listed-incremental="$SNAPSHOTFILE" -f "$ARCHIVENAME"; } 2>> stopwatch-$ARCHIVENAME.txt >/dev/null

stopped1=$(date +"%T h ( %s )")
stopped=$(date +%s)
ftpstarted=$stopped

thetime=$(($stopped-$start)) # doing some math in shell that's why $()

echo " " >>stopwatch-$ARCHIVENAME.txt
echo -n "File Size (Byte-s) : " >>stopwatch-$ARCHIVENAME.txt
ls -l "$ARCHIVENAME" | awk '{print $5}' >>stopwatch-$ARCHIVENAME.txt
# awk picks the 5th field (the size) reliably; cut -f 5 -d ' ' breaks when ls pads columns with extra spaces
echo " " >>stopwatch-$ARCHIVENAME.txt
echo "Started: " $start1 >>stopwatch-$ARCHIVENAME.txt
echo "Stopped: " $stopped1 >>stopwatch-$ARCHIVENAME.txt
echo "Time needed: " $(date -d "1970-01-01 $thetime sec" +"%H:%M:%S") / $thetime "secs" >>stopwatch-$ARCHIVENAME.txt
# outputs: Time needed: 00:03:14 / 194 secs


echo "-----------------------" >>stopwatch-$ARCHIVENAME.txt
echo " " >>stopwatch-$ARCHIVENAME.txt
echo "FTP start:" >>stopwatch-$ARCHIVENAME.txt
# again, I don't need the exact time: { time ./lftpscript.sh; } 2>> stopwatch-$ARCHIVENAME.txt
{ ./lftpscript.sh; } 2>> stopwatch-$ARCHIVENAME.txt

ftpstop1=$(date +"%T h ( %s )")
ftpstopped=$(date +%s)

ftptime=$(($ftpstopped-$ftpstarted))

echo " " >>stopwatch-$ARCHIVENAME.txt
echo "Start of FTP: " $stopped1 >>stopwatch-$ARCHIVENAME.txt
echo "End of FTP: " $ftpstop1 >>stopwatch-$ARCHIVENAME.txt
echo "Time of FTP transfer: " $(date -d "1970-01-01 $ftptime sec" +"%H:%M:%S") / $ftptime "secs" >>stopwatch-$ARCHIVENAME.txt

mail -s "Backup of $ARCHIVENAME" "email address of recipient" <stopwatch-$ARCHIVENAME.txt
#finally email report :-)
[/code]
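The full-versus-incremental mechanics that backup.sh leans on can be reproduced in isolation. This sketch (made-up file names; the real script uses ./usr-full and ./usr-1) shows why the script removes the snapshot before a full backup and copies it before an incremental one:

```shell
#!/bin/sh
# GNU tar --listed-incremental demo: the snapshot (.snar) file records what
# was archived; a later run against a COPY of that snapshot stores only the
# changes since the full backup. Paths and names here are hypothetical.
set -e
demo=$(mktemp -d)
cd "$demo"
mkdir src
echo one > src/a.txt

# "Sunday": remove any old snapshot so the backup is truly full
rm -f full.snar
tar -czf full.tgz --listed-incremental=full.snar src

# during the week a new file appears
echo two > src/b.txt

# "weekday": incremental relative to the full backup; work on a copy so
# full.snar keeps describing the state as of the full backup
cp full.snar incr.snar
tar -czf incr.tgz --listed-incremental=incr.snar src

tar -tzf incr.tgz   # lists src/ and only the new src/b.txt
```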


That's it... for now. I hope you find it useful; if something is not explained well I'll make corrections, or you can post improved suggestions.
I'd be happy to use them to protect my servers.

Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2010/01/18 13:41:50
by hansolo
I would like to ask what needs to be archived to make a 'complete' server backup, so that in case of HDD failure one can successfully restore to the last working state.

Any boot or system files, or something else besides just the data in /home?

Any comments welcome.

TY

Backup script with TAR / incremental backup with simple FTP

Posted: 2010/01/21 04:34:10
by byebye
Thanks for sharing the script.

It needed some changes for my personal use,
and I like manual operation :lol:

Re: Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2010/01/21 12:08:09
by TrevorH
I suspect you could have saved yourself some time if you had found 'duplicity' first :-(

Re: Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2010/01/24 13:51:25
by hansolo
@ [b] byebye [/b]:
NP,
as long as it works the way you need it to. :-) I'm glad you managed it.



@ [b] TrevorH [/b]:

Indeed.
As I am looking at some how-tos and config manuals, I will be moving on to Duplicity as soon as I get the hang of it.

If any example scripts exist, I'll be happy to look at them...

TY

Re: Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2011/02/26 23:28:46
by liviujianu
Hi hansolo,

I need a little help, please.
After I run your script and make backups, how do I restore a backup?

Thank you

Re: Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2011/02/27 16:17:02
by hansolo
@ liviujianu :

Hi,

just untar the archive file (on your server) like any other, with: "# tar xzvf filename.tgz "
BE CAREFUL:
you must cd to the directory where you want to extract your backup (e.g. /var/tmp);
otherwise, if you are in /, your files will be overwritten with the files from the backup, which might not be acceptable.

Then just copy the needed files from the extracted backup back to their original place,
e.g.: [i]/var/tmp/extracted-backup/[/i]# cp -v [b][color=FF0000].[/color][/b]/etc/httpd/conf/httpd.conf /etc/httpd/conf/ [b][i]<-- note the bolded red dot, which means
copy from the extracted backup under the current path to /etc[/i][/b]

The whole command would look like this (assuming you use the same paths);
first change to a temporary directory where the backup files will be extracted (for this example I made /var/tmp/extracted-backup):
[code][i]/var/tmp/extracted-backup/[/i]# tar [b]x[/b]zvf /usr/tmp/server_backup/BACKUP-full-'file date you want'.tgz[/code]
then copy the files you need.
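One caveat worth adding for archives created with --listed-incremental: to bring back the exact state (including files that were deleted between backups), GNU tar should replay the full archive and then each incremental in order, each extraction using --listed-incremental=/dev/null. A sketch with made-up names, building a tiny chain first and then restoring it into a scratch directory:

```shell
#!/bin/sh
# Restore sketch for a full + incremental chain. Extracting with
# --listed-incremental=/dev/null tells GNU tar to honour the incremental
# metadata stored in the archives. All names here are hypothetical.
set -e
demo=$(mktemp -d)
cd "$demo"

# build a tiny chain: full backup, then one incremental with a new file
mkdir src
echo one > src/a.txt
tar -czf full.tgz --listed-incremental=full.snar src
echo two > src/b.txt
cp full.snar incr.snar
tar -czf incr.tgz --listed-incremental=incr.snar src

# restore into a scratch directory, never straight into /
mkdir restore
cd restore
tar -xzf ../full.tgz --listed-incremental=/dev/null
tar -xzf ../incr.tgz --listed-incremental=/dev/null
ls src   # a.txt and b.txt are both back
```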

Re: Backup script with TAR / incremental backup with simple FTP to another location and email status

Posted: 2011/02/28 02:54:01
by liviujianu
[quote]
hansolo wrote:
just UNtar the archive file (on your server) like any other with: "# tar xzvf filename.tgz "
...[/quote]

Thank you, hansolo, for the reply. I don't know why I didn't even try tar xzvf on the file.

All the best :-)