
Backup policy

 
Re: Backup policy

Sounds like a job for a progressive rsync with --delete, run only on the filedir directory of moodledata to a large archive drive.

First do a dry run to see how long it takes and how large it is.   Then run it for real.   The first run will take a long time and acquire it all, but progressive runs from that point on will update the 'backup' with only the files that are new, and --delete will compare what's in filedir with what is archived: if a file has been deleted in filedir, the same file will be deleted on the rsync'd drive.
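
A minimal sketch of that (assuming moodledata sits at /var/moodledata and the archive drive is mounted at /mnt/archive - both paths are placeholders for your own):

# dry run first: shows what would be transferred and deleted, plus the totals
rsync -avh --dry-run --delete /var/moodledata/filedir/ /mnt/archive/filedir/

# same command for real once the dry run looks right
rsync -avh --delete /var/moodledata/filedir/ /mnt/archive/filedir/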

See man rsync.

'spirit of sharing', Ken


 
Re: Backup policy

Can I come back to this?

Currently, we have a Perl script running from cron that collects the Moodle data directory from our NFS server and sends it to our "Swift" backup server.

I haven't seen the script, as I have no access to the Swift server. Why, I have no idea.

Anyway, how would we do a progressive rsync with --delete on such a system?

So, we would need to back up the whole moodledata directory on our NFS share (currently only 1.2 GB).  We would then need to collect this backup from the NFS server using rsync and compare it to what? Bearing in mind we would want multiple copies, i.e. daily, weekly, monthly.


Sorry, I am a little confused.


Cheeeers Ken

 
Re: Backup policy

Did say 'it sounds like'! ;)

Hmmmm ... no Vulcan Mind Meld possible ... perl script?  If you haven't seen it, then I certainly have no knowledge of it either.

Are you asking for an example rsync command?

man rsync is your friend there ... but ... I'll share one I used this week.  The dry run shows what it would have done - so take a brief look at what files/folders it was working with, and also, importantly, the summary at the end ... how long it took, totals of files/sizes, etc.

[root@sos backup]# cat syncbucketdry
rsync -avzh --dry-run --progress --delete ./ /root/gcloud* /mnt/gbucket/

command + options: rsync -avzh --dry-run --progress --delete

man rsync will show all the options/switches, etc. ... no sense re-inventing the wheel here.

source: ./ /root/gcloud*

I was issuing it from /home/backup/ [the ./], which contained tarballs of code/data and SQL dumps in m## directories, etc. ... i.e., local server backups.   The dry run also rsync'd the /root/gcloud* files and directories.

[root@sos home]# du -h ./backup
256K    ./backup/webmin
70M    ./backup/mysql
6.7G    ./backup/m27
16G    ./backup/m30
485M    ./backup/blog
12M    ./backup/m32/auto
3.8G    ./backup/m32
1.2G    ./backup/m31/courses
16G    ./backup/m31
103M    ./backup/webmindb
12G    ./backup/unirepo
17G    ./backup/m33
13G    ./backup/m34
6.9G    ./backup/m35
92G    ./backup

destination: /mnt/gbucket/ (a Google Bucket - 1.0P - that's Petabyte)

So let's say I ran backups of a moodle35 site, which would put new tarballs and an SQL dump in /home/backup/m35/.   The rsync would pick up the new files not present before and transfer them to the bucket.  And if I had deleted all the previous tarballs and SQL dumps from /home/backup/m35/, the --delete would remove those same files from /mnt/gbucket/m35 and transfer only the new ones.
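
If it helps, here's a throwaway illustration of that behavior using scratch directories (all paths made up for the demo):

# toy source and destination
mkdir -p /tmp/src /tmp/dst
touch /tmp/src/old.tar.gz
rsync -av --delete /tmp/src/ /tmp/dst/    # /tmp/dst now holds old.tar.gz

# simulate a new backup cycle: old backup removed, new one created
rm /tmp/src/old.tar.gz
touch /tmp/src/new.tar.gz
rsync -av --delete /tmp/src/ /tmp/dst/    # old.tar.gz removed from /tmp/dst, new.tar.gz transferred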

That's clear as mud, isn't it! :|  Again, man rsync is your friend.

And rsync does require some study and dry runs to get it right.

One doesn't need multiple copies ... I run an rsync to dup a moodledata directory on another Moodle site once a day.   It's a school district site, so almost all summer long there were hardly any changes.  With school about to begin another academic year, teachers have been into their courses adding/deleting, etc., so recently there have been more changes.
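
For a once-a-day job like that, a single cron entry is all it takes (the time and paths here are just placeholders):

# crontab -e ... run the sync at 02:00 every night
0 2 * * * rsync -ah --delete /var/moodledata/ /mnt/mirror/moodledata/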

How is one assured that rsync is working?  By comparing what was rsync'd against what's in the archive location.

An example command was provided:

[root@sos home]# du -h ./backup shows the totals on the server to be backed up - shown above.

Run the same command on the destination location and compare.
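
Another quick check (same idea as my command above, minus the extra source): repeat the dry run after the real run - if source and archive already match, it has nothing left to transfer or delete.

rsync -avzh --dry-run --delete /home/backup/ /mnt/gbucket/
# an empty file list in the output means the two sides already match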

Now if one insists on daily/weekly/monthly copies ... the destination had better be 1.0P at least ... I would think.  And maybe 3 rsync scripts (maybe with different options) ... one for daily, one for weekly, one for monthly.
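
A rough sketch of what those 3 scripts might look like as cron entries (the directory names are placeholders - each copy gets its own destination so they don't overwrite each other):

# /etc/crontab - daily at 01:00, weekly on Sundays at 02:00, monthly on the 1st at 03:00
0 1 * * *  root  rsync -ah --delete /home/backup/ /mnt/gbucket/daily/
0 2 * * 0  root  rsync -ah --delete /home/backup/ /mnt/gbucket/weekly/
0 3 1 * *  root  rsync -ah --delete /home/backup/ /mnt/gbucket/monthly/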

That help any?

'spirit of sharing', Ken



 