sorry this has taken so long... I don't get much time to work on it, many other projects...
this was fixed: the script was simply deleting the old file before there was a new one to check against, so the order is okay now. (learning curve!)
it turns out that gzip --rsyncable will not work as intended. rsync cannot update the contents of a tar once it is compressed, so the whole archive would have to be re-compressed from scratch on every run, and that load is exactly what I am trying to avoid...
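for reference, the kind of pipeline this rules out looks roughly like the line below (paths made up). --rsyncable only keeps the .gz delta-friendly for the rsync transfer; locally, gzip still has to re-compress the entire archive every run:

    tar -cf - -C /var/www example.com | gzip --rsyncable > /var/backups/example.com.tar.gz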
two options...
1. I may be able to do this with the open-source software 'duplicity', which is in the main (Debian) repositories and could be integrated, though with a lot of work. it is the only one that supports doing this with compression. the catch is that it produces one main full-backup file plus a chain of incremental files, so a client cannot just download one big backup file (though I could probably script something to re-create a single file on demand). it would let a client restore to any point in time (with the archival window set per client: 1 day, 1 week, 1 month, etc.), and that would not take up much space because it de-dupes. I want this archival feature anyway. (a rough command sketch follows this list.)
2. rsync + tar. this would keep an uncompressed tar backup of the domain, refreshed each day via rsync --delete, which gets rid of the big nightly compression strain and still leaves one big backup file for clients. the problem is that it is not compressed, so it will take up more space. (databases would stay compressed.) (a sketch of this follows too.)
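to make option 1 concrete, here is a minimal duplicity sketch. the paths, the 7-day window, and --no-encryption are placeholder assumptions for illustration, not the final integration:

    SRC="/var/www/example.com"                  # hypothetical domain root
    DST="file:///var/backups/example.com"       # hypothetical local target

    # first run creates a full backup; later runs add incrementals automatically
    duplicity --no-encryption "$SRC" "$DST"

    # enforce the per-client archival window, e.g. keep 1 week of history
    duplicity remove-older-than 7D --force --no-encryption "$DST"

    # restore the state from any point inside the window, e.g. 3 days ago
    duplicity restore --no-encryption --time 3D "$DST" /tmp/restore-example.com

and one way to read option 2 (again just a sketch with made-up paths): keep a cheap daily mirror with rsync --delete, then re-pack it as a plain uncompressed tar that clients can grab in one piece:

    SRC="/var/www/example.com/"
    MIRROR="/var/backups/mirror/example.com/"

    # only changed files are copied; --delete drops files removed from the domain
    rsync -a --delete "$SRC" "$MIRROR"

    # tar without compression is mostly disk I/O, so the nightly CPU load stays low
    tar -cf /var/backups/example.com.tar -C "$MIRROR" .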
ideas? maybe I should just do rsync + tar for 1.1.0 and try to make duplicity happen for the next release (and for myself).
also, iMSCP does not back up mail at this point. I think this should be integrated, agree?