Hello,
I was looking for a way to generate backups on the fly (without storing them locally) on an FTP backup space. So I wrote this small script, which is able to do that.
ATTENTION:
Do not use this script in a production environment or with an unstable FTP server location (e.g. laggy cloud services). The script does not give feedback on errors, and there is only one backup stored on the FTP server - so if it is broken, there is no other backup! Use at your own risk!
ATTENTION 2:
I switched to Syncovery CL for Linux, so I am no longer supporting/maintaining this script. If you are using this script and want more (like incremental backups or SFTP/WebDav support), you may check out the new backup solution I am using: click here.
Features:
- generates compressed .tar.bz2 archives directly on an FTP backup space during compression (no local disk space needed)
- wildcards for include and exclude paths
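The core of the "on the fly" feature is that tar can write the archive to stdout, so any stream consumer receives it without a temporary file; in the script that consumer is `ncftpput -c`, which uploads from stdin. A minimal sketch of the pattern (the directory and the `wc -c` stand-in consumer are placeholders so it runs without an FTP server):

```shell
#!/bin/sh
# On-the-fly backup pattern: tar writes the archive to stdout and the
# pipe consumer receives it -- no local archive file is ever created.
# With ncftp installed, the real consumer would be something like:
#   ncftpput -m -c -u "$FTPUSER" -p "$FTPPASS" "$FTPHOST" /backups/demo.tar.gz
# Here "wc -c" stands in for the upload so the pattern is testable offline.
mkdir -p /tmp/demo_src
echo "hello" > /tmp/demo_src/file.txt
tar czf - -C /tmp demo_src | wc -c   # prints the size of the streamed archive
```

The same pattern works with any compressor tar supports (`z` for gzip, `j` for bzip2), as long as the consumer reads from stdin.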
There are still some features missing, so maybe one of you can help me improve this script:
- Encrypt the backups with a password
- Add SFTP support
- Backup version history (delete only backups from the FTP server that are no longer needed)
- Detect errors and send an email if there are one or more errors
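For the encryption wish, one approach would be to pipe the tar stream through `openssl enc` before it leaves the machine. This is only a sketch, not part of the original script; the literal `pass:` passphrase and the local output file are placeholders for illustration (in the real script the final consumer would be `ncftpput -c` again):

```shell
#!/bin/sh
# Sketch: encrypt the archive stream before upload (assumed approach,
# not part of the original script).
PASSPHRASE="change-me"   # illustration only; prefer -pass file:... in practice
mkdir -p /tmp/enc_src
echo "secret data" > /tmp/enc_src/data.txt
# Compress and encrypt in one pipeline; writing to a local file here
# only so the sketch is verifiable without an FTP server.
tar czf - -C /tmp enc_src \
  | openssl enc -aes-256-cbc -pbkdf2 -pass "pass:$PASSPHRASE" \
  > /tmp/backup.tar.gz.enc
# Restore check: decrypt and list the archive contents
openssl enc -d -aes-256-cbc -pbkdf2 -pass "pass:$PASSPHRASE" \
  -in /tmp/backup.tar.gz.enc | tar tzf -
```

Because the encryption happens inside the pipe, the unencrypted archive never touches the disk, which fits the "no local space" goal of the script.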
Version 1.0:
#!/bin/sh
# Copyright 2015 by Stefan 'UncleSam' Ruepp
# Licensed under CC - http://creativecommons.org/licenses/by/3.0/legalcode
# Use at your own risk
#
# Versions:
# 1.0
# - First release at i-mscp.net
#
# Requirements:
# - ncftp (apt-get install ncftp)
#
# Tested with:
# Ubuntu Server 12.04 LTS and 14.04 LTS
#
# It is preferred to run this script using crontab.
# The following line shows an example that runs it every day at 00:00 without any output.
# To capture the output you could use e.g. the "tee" command, or let crontab mail the output to your email account.
# 0 0 * * * /root/backup.sh >/dev/null 2>&1
#
# ATTENTION:
# This script deletes and recreates the folder set by "FTPPATH" on EVERY backup run to clean out old backups. There is no version history and no setting
# for how many backups should be kept. So make sure that this folder is unique, because the script does not check what it deletes!
# (It will delete everything in "FTPPATH" before storing the new backup file in it!!)
# E.g. if you set FTPPATH to "/backups/imscp/", the script will delete the folder "imscp" on every run and recreate it before storing files in it.
# Configuration part:
TARNAME="imscp" # name of the generated archive, e.g. TARNAME.tar.bz2
FTPUSER="" # username of your backup account
FTPPASS="" # password of your ftp account
FTPHOST="" # host address of your ftp account
FTPPATH="" # path inside the ftp account, e.g. "/backups/imscp/" - this should be UNIQUE! (read the "ATTENTION" part above!!!)
LOCALBACKUP="/home/ /root/ /var/www/virtual/*/backups/ /etc/ /var/mail/ /var/www/imscp/gui/plugins/" # paths to save in the backup (* is allowed)
EXCLUDEPATHS="/etc/apache2/*/access.log" # paths to exclude from the backup (* is allowed)
# DO NOT CHANGE BELOW!!! (Script starts here)
# remove previously generated file lists
echo -n "Removing old backup file lists..."
rm -f /tmp/tar_$TARNAME.lst
rm -f /tmp/tar_exclude_$TARNAME.lst
echo "Done"
# get files to ignore
echo -n "Getting list of ignored folders..."
touch /tmp/tar_exclude_$TARNAME.lst
find $EXCLUDEPATHS -type f -print > /tmp/tar_exclude_$TARNAME.lst
echo "Done"
# find all files to back up (with wildcards)
echo -n "Getting all files to backup..."
find $LOCALBACKUP -type f -print | grep -vf /tmp/tar_exclude_$TARNAME.lst > /tmp/tar_$TARNAME.lst
echo "Done"
echo "Starting backup process in 10 seconds - to abort press CTRL+C"
sleep 10
# remove old backup files
echo "Deleting old backups on the FTP server..."
ncftp -u $FTPUSER -p $FTPPASS $FTPHOST << EOF
rm -rf $FTPPATH
mkdir $FTPPATH
exit
EOF
# store the new backup on the fly
echo "Creating new backup on the fly"
tar cvjf - -T /tmp/tar_$TARNAME.lst | ncftpput -m -c -u $FTPUSER -p $FTPPASS $FTPHOST $FTPPATH/$TARNAME.tar.bz2
echo "Backup finished"
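The wildcard exclude handling in the script works by piping the `find` output through `grep -vf`, which drops every path that matches a line of the exclude list. A standalone sketch of that filtering step (the `/tmp/lst_src` paths are placeholders):

```shell
#!/bin/sh
# Build a file list, then filter out excluded paths with grep -vf,
# mirroring the include/exclude step of the backup script.
mkdir -p /tmp/lst_src/logs
echo "keep" > /tmp/lst_src/keep.txt
echo "drop" > /tmp/lst_src/logs/access.log
# Exclude list: every file under the logs directory
find /tmp/lst_src/logs -type f -print > /tmp/exclude.lst
# Include list: all files, minus anything matching a line in exclude.lst
find /tmp/lst_src -type f -print | grep -vf /tmp/exclude.lst > /tmp/include.lst
cat /tmp/include.lst
```

Note that `grep -f` treats each line of the exclude file as a regular expression, not a literal path, so special characters in excluded paths could cause unexpected matches; `grep -F` would match literally.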