General Help & Advice > Linux Support

How to Back Up Deletes


Hi all,

I have a simple backup policy: two drives rsynced weekly. This works well and has the plus of being nice and easy.

I would like to make these syncs more frequent to reduce data loss, but I need some protection from, and backup for, deletes.

My first thought is to scan the two disks and copy the deleted files somewhere safe before the rsync.

Before I start anything, I thought I'd ask...
Is there a switch in rsync that I'm not aware of that does this?
...or is there a better option (i.e. what should I do?)?

Let's say I'd like to retain 7 days of deletes while rsyncing every day.

Thanks in advance,

Hello Brian - and welcome to the Forum.

Why do you want to back up deleted files? And why for only seven days? Would it not be better not to delete them until their 7-day usefulness has expired?

As it happens, the bare rsync command does not remove from the earlier backup any files that you have subsequently deleted - you can check this to confirm it - it merely updates the directories you specify.

Perhaps the easiest solution is
--- Code: ---
rsync -a --delete Documents Pictures Music .local/share/Trash /path/to/backup
--- End code ---
which will back up your specified directories (removing from the backup any files that you have deleted since the last one) and also back up the whole of your Rubbish bin.  You ought to be keeping your Rubbish bin fairly empty anyway - say keeping only the last month's deleted files.  A well-maintained rubbish bin will not take up much backup-space. 
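If you want to see for yourself that rsync without --delete leaves previously-deleted files in the backup, a throwaway test in /tmp will show it (the paths here are purely illustrative):

--- Code: ---
# Make a tiny source and back it up twice, deleting a file in between.
mkdir -p /tmp/rsync-demo/src /tmp/rsync-demo/dst
touch /tmp/rsync-demo/src/keep /tmp/rsync-demo/src/doomed

rsync -a /tmp/rsync-demo/src/ /tmp/rsync-demo/dst/   # first backup: both files copied
rm /tmp/rsync-demo/src/doomed                        # "accidentally" delete a file
rsync -a /tmp/rsync-demo/src/ /tmp/rsync-demo/dst/   # second backup, no --delete

ls /tmp/rsync-demo/dst    # "doomed" is still present in the backup
--- End code ---
Add --delete to the second rsync and "doomed" disappears from the backup too - which is exactly the behaviour you were asking about.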


Thanks Keith - I'm now glad I asked the question :)

I have been obsessing over deletes, but realise I should include changes too. The reason I care is that I have numerous users uploading and changing files, so I'd like to provide a little window for recovery. Not that I've had that problem yet!

My current two-drive setup is "easy" (and yes, I do --delete to keep them the same), and that gives me a roll-back to "last rsync", but nothing in between nor after. I was wondering how best to provide a more "enterprise" backup option - hopefully without buying extra drives.

I've been reading tonight, and while I've never got my head around incremental backups - I guess that's my weekend's homework! ;)

Feel free to point me in a better direction, or tell me if I'm worrying about the wrong things, and I'll report back in case anybody else cares...

Thanks again,

(and yes, I do --delete to keep them the same)   Good - so saving the Rubbish bin would solve the deleted-files problem.

If you have "numerous users" then it might be advisable to save daily backups for each user to appropriately-named files, keeping only, say, the last ten days' backups for each user.  And it wouldn't need to be an incremental backup. 

Approximately how much data do you save each time for all your users? And do you save their entire Home directory? If so, then using "tar/gz" on Home would save the whole thing in compressed form to save space. {Search "linux tar command" for examples}
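A minimal sketch of the tar/gz idea - the directory layout below is a stand-in built in /tmp, not anyone's real Home:

--- Code: ---
# Archive a (stand-in) Home directory into one dated, compressed file.
mkdir -p /tmp/tar-demo/home/fred /tmp/tar-demo/backups
echo "hello" > /tmp/tar-demo/home/fred/notes.txt

tar -czf "/tmp/tar-demo/backups/home-$(date +%F).tar.gz" \
    -C /tmp/tar-demo home              # -C keeps paths relative inside the archive

tar -tzf "/tmp/tar-demo/backups/home-$(date +%F).tar.gz"   # list contents to verify
--- End code ---
On a real system you would point tar at /home and at your backup drive instead of /tmp.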

If you write a script for doing the backups, the file-date naming can be done automatically. If the users are all users of the same PC, then the usernames can be incorporated automatically in the backup filename too, e.g. Backups/Fred/2021-05-22. Indeed, by using a "cron job" you could automate the entire process, although starting with a simple method might be wise!
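As a sketch of how such a script might look - the usernames, the /tmp paths and the ten-archive retention are all illustrative assumptions, not a tested production script:

--- Code: ---
#!/bin/sh
# Per-user dated backups, pruning all but the newest ${KEEP} archives.
HOME_ROOT=/tmp/script-demo/home        # stand-in for /home
BACKUP_ROOT=/tmp/script-demo/backups   # stand-in for the backup drive
KEEP=10                                # days of archives to retain per user

mkdir -p "${HOME_ROOT}/fred" "${HOME_ROOT}/wilma"   # demo home directories

for user in fred wilma; do
    dest="${BACKUP_ROOT}/${user}"
    mkdir -p "${dest}"
    # one archive per user per day, e.g. .../fred/2021-05-22.tar.gz
    tar -czf "${dest}/$(date +%F).tar.gz" -C "${HOME_ROOT}" "${user}"
    # keep only the newest ${KEEP} archives, delete the rest
    ls -1t "${dest}" | tail -n "+$((KEEP + 1))" | while read -r old; do
        rm -f "${dest}/${old}"
    done
done
--- End code ---
Run daily from cron with something like "0 2 * * * /path/to/backup_users.sh" (the time and path are, again, just placeholders).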


Mad Penguin:
Hi Brian,

For what it's worth, you might like to take a look at the following script, which I run daily over a bunch of hosts. It's something that's evolved over time, but it provides what seems to be a relatively good incremental backup approach using rsync. Specifically, it maintains a folder called "current" which "should" be a mirror of your target, then a bunch of other (dated) folders which contain the files changed or deleted as of the backup on that date. (9 times out of 10 I want access to the backup because I've inadvertently deleted something I still want... :) )

--- Code: ---
HOST=${1}                            # target host, passed as the script's argument
HERE=/vols/backup/servers/${HOST}    # destination path - tune to taste
EXCL=.excludes                       # exclusion list in the destination root
BACK=`date "+%A_%d_%B_%Y_%H"`
OPTS="--sparse --force --ignore-errors --delete-excluded --exclude-from=${HERE}/${EXCL}
      --bwlimit=4096 --delete --backup --backup-dir=${HERE}/${BACK} -rvlpgoDtO"
mkdir -p ${HERE}/current
touch ${HERE}/${EXCL}
nice -n19 rsync ${OPTS} root@${HOST}:/ ${HERE}/current
--- End code ---

In addition, it ignores paths listed in the file ".excludes", which is stored in the root of the destination folder tree; for example:

--- Code: ---
# illustrative contents of .excludes
/proc/*
/sys/*
/dev/*
/run/*
/tmp/*
--- End code ---

Things you might want to tune include HERE (the destination path) and 4096 (which limits bandwidth usage, in KiB/s). Call it with:

--- Code: ---
./backup_script (host)
--- End code ---

This tries to back up the entire target, filtered by the paths in .excludes.
It's "as-is" and use at your own risk, but either way it might help re: ideas.

---- Amended ----
Just to clarify a little: the backup for "this" server produces a directory structure like this:

--- Code: ---
$ ls /vols/backup/servers/legacy/
current                Monday_17_May_2021_01    Sunday_16_May_2021_01    Tuesday_11_May_2021_01
Friday_14_May_2021_01  Saturday_15_May_2021_01  Thursday_13_May_2021_01  Tuesday_18_May_2021_01
Friday_21_May_2021_01  Saturday_22_May_2021_01  Thursday_20_May_2021_01  Wednesday_19_May_2021_01
--- End code ---
So "current" contains my current backup, while "Saturday_22_May_2021_01" contains all the files that were changed or deleted during the most recent backup, etc. ... :)
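Which means a restore is just a copy back out of the relevant dated folder. A self-contained illustration - the tree and the filename below are mocked up in /tmp, not the real server:

--- Code: ---
# Mock up one dated backup folder like the listing above...
SNAP=/tmp/restore-demo/Saturday_22_May_2021_01
mkdir -p "${SNAP}/home/brian"
echo "draft" > "${SNAP}/home/brian/report.odt"

# ...then the actual restore step is a plain cp out of the dated folder:
cp "${SNAP}/home/brian/report.odt" /tmp/restore-demo/report.odt
--- End code ---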

