How to Back Up Deletes

Hi all,

I have a simple backup policy: two drives rsynced weekly. This works well and has the plus of being nice and easy.

I would like to make these syncs more frequent to reduce data loss, but I need some protection from (or a backup of) deletes.

My first thought is to scan the two disks and copy the deleted files somewhere safe before the rsync.

Before I start anything, I thought I’d ask…
Is there a switch in rsync that I’m not aware of that does this?
…or is there a better option (AKA, what should I do)?

Let’s say I’d like to retain 7 days of deletes while rsyncing every day.

Thanks in advance,

Hello Brian - and welcome to the Forum.

Why do you want to back up deleted files? And why for only seven days? Would it not be better to not delete them until their 7-day usefulness has expired?

As it happens, the bare command rsync does not remove from the earlier backup any files that you have subsequently deleted - you could check this to confirm it - it merely updates the directories you specify.

Perhaps the easiest solution is something like (with your backup drive mounted at, say, /media/backup - substitute your own destination):

rsync -a --delete Documents Pictures Music .local/share/Trash /media/backup

which will back up your specified directories (removing from the backup any files that you have deleted since the last one) and also back up the whole of your Rubbish bin. You ought to be keeping your Rubbish bin fairly empty anyway - say keeping only the last month’s deleted files. A well-maintained rubbish bin will not take up much backup-space.


Thanks Keith, and I’m now glad I asked the question :slight_smile:

I have been obsessing over deletes but realise I should include changes too. And the reason I care is that I have numerous users uploading and changing files, so I’d like to provide a little window for recovery. Not that I’ve had that problem yet!

My current two-drive setup is “easy” (and yes, I do --delete to keep them the same) and that gives me a roll-back to “last rsync” but nothing in between nor after. I was wondering how best to provide a more “enterprise” backup option - hopefully without buying extra drives.

I’ve been reading tonight, and while I’ve never got my head around incremental backups - I guess that’s my weekend’s homework! :wink:

Feel free to point me in a better direction or tell me if I’m worrying about the wrong things, and I’ll report back in case anybody else cares…

Thanks again,

“(and yes, I do --delete to keep them the same)” - Good! So saving the Rubbish bin would solve the deleted-files problem.

If you have “numerous users” then it might be advisable to save daily backups for each user to appropriately-named files, keeping only, say, the last ten days’ backups for each user. And it wouldn’t need to be an incremental backup.

Approximately how much data do you save each time for all your users? And do you save their entire Home directory? If so then using “tar/gz” on Home would save the whole thing in a compacted form to save space. {Search “linux tar command” for examples}
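As a sketch of that tar/gz idea - every path below is a throwaway stand-in created just for the demo, not a real location:

```shell
# Archive a (pretend) Home directory into a dated, compressed tarball.
SRC=$(mktemp -d)    # stand-in for a user's Home
DEST=$(mktemp -d)   # stand-in for the backup drive
echo "important notes" > "${SRC}/notes.txt"

STAMP=$(date +%F)   # e.g. 2021-05-22
tar -czf "${DEST}/home-${STAMP}.tar.gz" -C "${SRC}" .
tar -tzf "${DEST}/home-${STAMP}.tar.gz"   # list the archive's contents
```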

If you write a script for doing the backups, the file-date naming could be done automatically. If the users are all users of the same PC then the usernames could be incorporated automatically in the backup filename too, e.g. Backups/Fred/2021-05-22. Indeed, by using a “cron job” you could automate the entire process, although starting with a simple method might be wise!
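A minimal sketch of that naming-and-pruning idea - the usernames and paths are invented, and a cron entry such as “30 2 * * * /path/to/nightly.sh” would run it automatically each night:

```shell
# Per-user, date-named backup files, pruned to the ten newest.
BASE=$(mktemp -d)   # stand-in for something like Backups/ on the backup drive

for user in fred wilma; do      # invented usernames
    dir="${BASE}/${user}"
    mkdir -p "${dir}"
    # In real use this would be a tar of the user's Home; here we just touch a file.
    touch "${dir}/$(date +%F)"
    # Keep only the ten newest backups for this user.
    ls -1t "${dir}" | tail -n +11 | while read -r old; do rm -f "${dir}/${old}"; done
done
```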


Hi Brian,

For what it’s worth, you might like to take a look at the following script, which I run daily over a bunch of hosts. It’s something that’s evolved over time but provides what seems to be a relatively good incremental backup approach using rsync. Specifically, it maintains a folder called “current” which “should” be a mirror of your target, then a bunch of other (dated) folders which contain the files changed or updated as of the backup on that date. (9 times out of 10 I want access to the backup because I’ve inadvertently deleted something I still want … :slight_smile: )

HOST=${1}                          # target host, passed on the command line
HERE=/vols/backup/servers/${HOST}  # destination path - tune to suit your layout
EXCL=.excludes                     # exclusion list, kept at the root of the destination
BACK=$(date "+%A_%d_%B_%Y_%H")     # dated folder name, e.g. Saturday_22_May_2021_01
OPTS="--sparse --force --ignore-errors --delete-excluded --exclude-from=${HERE}/${EXCL}
      --bwlimit=4096 --delete --backup --backup-dir=${HERE}/${BACK} -rvlpgoDtO"
mkdir -p ${HERE}/current
touch ${HERE}/${EXCL}
nice -n19 rsync ${OPTS} root@${HOST}:/ ${HERE}/current

In addition it ignores paths listed in the file “.excludes”, which is stored in the root of the destination folder tree.
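For instance, a hypothetical “.excludes” for a whole-system backup might list the virtual and temporary filesystems that should never be copied:

/proc/*
/sys/*
/dev/*
/run/*
/tmp/*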
Things you might want to tune include HERE (the destination path) and 4096 (which limits the bandwidth usage) … call with:

./backup_script (host)

This tries to back up the entire target, filtered on the paths in .excludes.
It’s “as-is” and use at your own risk, but either way it might help re: ideas.

---- Amended ----
Just to clarify a little, the backup for “this” server produces a directory structure like this;

$ ls /vols/backup/servers/legacy/ 
current                Monday_17_May_2021_01    Sunday_16_May_2021_01    Tuesday_11_May_2021_01 
Friday_14_May_2021_01  Saturday_15_May_2021_01  Thursday_13_May_2021_01  Tuesday_18_May_2021_01 
Friday_21_May_2021_01  Saturday_22_May_2021_01  Thursday_20_May_2021_01  Wednesday_19_May_2021_01

So “current” contains my current backup, “Saturday_22_May_2021_01” contains all the files changed or deleted during the most recent backup, etc … :slight_smile:
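So restoring an accidental delete is just a copy out of the relevant dated folder. A simulated example (the layout and filename here are invented, mirroring the structure above):

```shell
# Rebuild a miniature version of the backup layout, then "restore" from it.
BACKUP=$(mktemp -d)    # stand-in for /vols/backup/servers/legacy
HOME_DIR=$(mktemp -d)  # stand-in for the user's Home

mkdir -p "${BACKUP}/Saturday_22_May_2021_01/Documents"
echo "draft" > "${BACKUP}/Saturday_22_May_2021_01/Documents/report.odt"

# The file was deleted from Home, but the dated folder still has it.
mkdir -p "${HOME_DIR}/Documents"
cp "${BACKUP}/Saturday_22_May_2021_01/Documents/report.odt" "${HOME_DIR}/Documents/"
```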

Thanks for sharing, Keith - I got caught up in real life, so I haven’t had time to do what I hoped.

Reading your script, it looks like we have similar concerns - mine being for a NAS (and to answer one of your questions: ~3TB), which also has a “./servers/${host}” setup as you describe - and it’s those I need to back up, which includes config and shared files as well as application data from all connected devices. The start of this was buying some shiny new drives - so I should have space enough to consider incremental backups, as you seem to have already implemented.

I follow your script (although “nice” is new to me, but nothing --help can’t fix :slight_smile: ). I’ll have a play and a read of the options being suggested.

Very much appreciated.

Hi again…

So - it looks like all I need to do is use --backup with --backup-dir= … and then find somewhere to store them. Oh! As it so happens, having bought new drives, I do have a 3TB spare sat around looking for a job! Sounds like enough space for the incrementals, while also keeping my mirror copy intact!

I don’t currently use as many of the options as you list (only “-avz --delete”).
So looking at what I’m missing out on:
sparse - sparse files contain “holes” (runs of zeros that use no disk space), and --sparse tries to recreate them efficiently at the destination - this sounds like an option worth including
force - good shout (delete directories even when not empty) and something I’ll adopt
ignore-errors - I’ve read in other forums that this ensures files are still deleted even when I/O errors occur, so that’s a good one for me to add too
bwlimit - I don’t think this is relevant in my case, because the files are already local
rvlpgoDtO - I currently use “-av”, which takes care of most of these options (but I agree, calling them out individually is cleaner)…

However, “-O” - I can’t find anything that neatly describes what it actually does.

The only extra one that I use is -z (compress file data during the transfer) - I assume from its omission in your usage that you don’t think it’s necessary.

The script was from the Mad Penguin, who’s an ace at that kind of thing.
As for nice: it’s a way of specifying what importance you attach to a job (how nicely you want to treat your computer!). The man pages offer a brief explanation.

I don’t use all those options, either. The -O option is
“-O, --omit-dir-times = omit directories from --times” where
“-t, --times = preserve modification times”
I could give that a miss, too.

Looks like you’ve got it all sussed - thanks for keeping us informed of progress.


Hi Brian,

I think -O means that if you do something in the target folder that amends the modification time for the folder, it won’t necessarily overwrite the timestamp during the next backup. Bear in mind this script has been growing (from many sources) over a period of maybe 10 years … at some point something somewhere suggested this and it appears to do no harm … :slight_smile:

In terms of “-z”, I find that over a local network, -z slows things down, and over a wide area network (broadband) it speeds things up. In this instance I’m running over a VPN connection which does implicit compression anyway, so doing it a second time has no effect other than adding to CPU usage … :slight_smile:

Yeah, specifically;

nice -n 19 <command>

will effectively execute the command using only “spare” CPU cycles, i.e. it should have almost zero impact on other processes running on the same machine. Generally speaking, rsync will use a bit of CPU, especially if you add “-z”, which may impact, for example, your desktop (or game :slight_smile: ) performance. Using “nice” just means you can run this from “cron” literally without noticing. (Sort of a “bwlimit” for the CPU.)
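A one-liner to see it in action - “nice” run with no command simply prints the current niceness, so wrapping it shows the priority the job inherited:

```shell
# Run a command at the lowest priority; inside it, plain "nice" reports the level.
nice -n 19 sh -c 'echo "running at niceness $(nice)"'
```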

Thanks for the info - These backups are for internal drives, so “-z” sounds pointless.

…but I like the sound of “nice” - I’ll have a play, because pretty much all of my processes are far from urgent - mostly cron jobs or startup scripts.

thanks again.

PS - that “--backup” works just as expected; I have re-employed my old 3TB drive, which is slowly filling up with daily directories of changes :slight_smile: