Backups

  • That's great, jox, thanks! I had actually been wanting to do something like that for the same reasons as you, so I have put it in mine and am almost done with the revamped backup script that uses gzip --rsyncable. I will post here when it's done; it would be nice to get it in in time for the 1.1 release :)

  • Hey guys,


    It's all pretty much ready, except for one thing...


    For my 'if' I need to use '-e' to check whether a file exists, but it seems as though the script doesn't have permission to check in the backups folder when it is not running a CMD.


    What is the best way to solve this? I need to do: if ( -e "$bkpDir/$_*.gz" ) {
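
    For reference, the shape of the check I have in mind — just a sketch, with $bkpDir and the filename pattern as placeholders; since -e only tests one literal filename, I go through glob() to expand the wildcard first:

        use strict;
        use warnings;

        my $bkpDir = '/var/www/virtual/example.com/backups';   # placeholder backup folder
        my $name   = 'example.com';                             # placeholder, stands in for $_

        # -e tests a single literal path, so expand the * wildcard with glob() first
        my @existing = glob("$bkpDir/$name*.gz");

        if (@existing) {
            print "found existing backup(s): @existing\n";
        } else {
            print "no backup for $name in $bkpDir yet\n";
        }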


    Even when I use a full path and filename, it cannot check in the backups folder; other folders on the system it can check fine, and the rest of my script works as planned.


    What permissions does the imscp-backup-all script run under? And is it different when I run it as root from a shell versus when the imscp_daemon runs it?


  • Normally the backup script is run with root privileges - so there's no permission problem.
    (It's started by cron as root... but you can also run it manually as root...)
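
    (If you want to verify from inside the script itself, here's a quick generic Perl check — nothing imscp-specific:)

        use strict;
        use warnings;

        # effective UID 0 means the process is running as root
        die "this script must be run as root\n" if $> != 0;
        print "running as root (uid $<, euid $>)\n";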


    /J

  • Sorry this has taken so long... I don't get much time to work on it; many other projects...


    This was fixed: the order in the script was simply deleting the old file before there was a new one to check, so that is okay now. (Learning curve!)


    It turns out that gzip --rsyncable will not work as intended: rsyncing into a tar archive cannot work once it is compressed, so the entire compression would have to be redone every time, and that load is exactly what I am trying to avoid...


    Two options...


    1. I may be able to do this with the open-source tool 'duplicity', which is included in the main Debian repositories and could be integrated with a lot of work; it is the only one that supports doing this with compression. But it makes one main file plus a bunch of incremental files, so a client cannot just download one big backup file (though I could probably script something to re-create a new single file on demand). It would allow a client to restore to any point in time (archival period set per client: 1 day, 1 week, 1 month, etc.), which would not take up much space because it uses de-duplication. I want this archival feature anyway. (See the rough sketch right after option 2.)


    2. rsync + tar. This would make an uncompressed tar backup of the domain and then each day do rsync --delete, which gets rid of the big backup strain and still leaves one big backup file for clients. The problem is that it is not compressed, so it will take up more space. (Databases will stay compressed.) (Sketch further down, after the next paragraph.)
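
    A rough sketch of what the duplicity route could look like if driven from the Perl backup script — the duplicity options and all paths here are placeholders/assumptions on my side, not anything tested against iMSCP:

        use strict;
        use warnings;

        my $domainDir = '/var/www/virtual/example.com';        # placeholder: files to back up
        my $target    = 'file:///var/backups/example.com';     # placeholder: backup target

        # incremental run (duplicity does a full backup automatically when none exists);
        # --full-if-older-than starts a fresh chain once a month, --no-encryption keeps it simple
        system('duplicity', '--no-encryption', '--full-if-older-than', '1M',
               $domainDir, $target) == 0
            or die "duplicity backup failed: $?\n";

        # drop chains older than the client's archival period (1 day / 1 week / 1 month ...)
        system('duplicity', 'remove-older-than', '1M', '--force', '--no-encryption', $target) == 0
            or warn "duplicity cleanup failed: $?\n";

        # a point-in-time restore for a client would then be roughly:
        #   duplicity restore --time 3D file:///var/backups/example.com /some/restore/dir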


    Ideas? Maybe I should just do rsync + tar for 1.1.0 and try to make duplicity happen for the next release (and for myself ;)
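
    Here is roughly what I picture for option 2, in case it goes into 1.1.0 — a sketch only, with all paths and the database name as placeholders:

        use strict;
        use warnings;

        my $domainDir = '/var/www/virtual/example.com/htdocs';   # placeholder: live domain files
        my $mirror    = '/var/backups/example.com/mirror';       # placeholder: rsync mirror
        my $tarFile   = '/var/backups/example.com/example.com.tar';

        # daily: copy only what changed, and drop files that were deleted on the live side
        system('rsync', '-a', '--delete', "$domainDir/", "$mirror/") == 0
            or die "rsync failed: $?\n";

        # rebuild the single client-downloadable archive, uncompressed to avoid the compression load
        system('tar', '-cf', $tarFile, '-C', $mirror, '.') == 0
            or die "tar failed: $?\n";

        # databases stay compressed, e.g. by piping mysqldump through gzip
        system('mysqldump example_db | gzip > /var/backups/example.com/example_db.sql.gz') == 0
            or warn "database dump failed: $?\n";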


    Also, iMSCP does not back up mail at this point; I think this should be integrated. Agree?
