How to add files from Moodle to an AWS S3 bucket?
+1 for Howard's question ... you might need to provide some information about how you are hosted (and maybe where - provider), operating system, etc.
Yes ... an Amazon S3 Bucket could be a mountpoint on Linux systems and there are quite a few blogs/resources that describe how to do that.
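For reference, one common way to do that mount on Linux is the s3fs-fuse tool. A minimal sketch, assuming a Debian/Ubuntu box; the bucket name `my-moodle-bucket`, the mount point `/mnt/s3`, and the key pair are all placeholders you'd replace with your own:

```shell
# Install s3fs-fuse (package name may differ on non-Debian distros)
sudo apt-get install -y s3fs

# Store the AWS key pair where s3fs expects it
# (AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY is a placeholder)
echo 'AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY' > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the bucket; allow_other lets the web-server user read/write it
sudo mkdir -p /mnt/s3
sudo s3fs my-moodle-bucket /mnt/s3 -o passwd_file=${HOME}/.passwd-s3fs -o allow_other
```

Plenty of blogs walk through variations of this; the point is just that once mounted, the bucket looks like any other directory to Moodle.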
IF ... IF ... you are thinking of using that bucket for *all* of moodledata, you might need to rethink, due to how frequently Moodle code accesses moodledata and how it uses the folders/directories in there.
Qualifications for this response: I don't use Amazon's bucket but do use Google's bucket ... similar ... but I do NOT use the bucket as the location of the entire moodledata directory ... nor of moodledata/filedir/
Nice thing about buckets ... they can be mounted on multiple systems not on the same network + the size ... my Google Bucket, as an example, is 1P - and affordable! ;)
A short hint on whether moving all of moodledata would be a good idea or not, and why, would be appreciated, as well as what could be "safely" moved (e.g., is there a way to move only the automated backup files to a separate storage?)
To your side-question, "e.g., is there a way to move only the automated backup files to a separate storage?" Sure!
Site administration > Courses > Backups > Automated backup setup: Automated backup storage (backup | backup_auto_storage) = Specified directory for automated backups
Same page: Save to (backup | backup_auto_destination) = /path/to/a/directory
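If you prefer the command line, recent Moodle versions ship admin/cli/cfg.php, which can set those same two values. A sketch, assuming your Moodle code lives in /var/www/moodle and the web-server user is www-data:

```shell
cd /var/www/moodle   # adjust to your Moodle code directory

# 1 = 'Specified directory for automated backups'
sudo -u www-data php admin/cli/cfg.php \
    --component=backup --name=backup_auto_storage --set=1

# /path/to/a/directory would be your bucket mount point
sudo -u www-data php admin/cli/cfg.php \
    --component=backup --name=backup_auto_destination --set=/path/to/a/directory
```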
Ok, what I shared was about a Google Bucket (GB) ... remote mass storage ... that's not an Amazon S3 Bucket.
Am gonna assume that an Amazon S3 Bucket could be a mount point on a Linux server hosted elsewhere. My GB mount is on a Rackspace hosted server as well as on a Google Compute Engine instance.
Think the issue with all of moodledata on the bucket (Google or Amazon) would be speed. Moodle caches heavily (it has 3 folders related to caches) and has a folder for sessions in moodledata/. Would be best, I think, if those were on the local server. filedir (a subdirectory of moodledata where all uploaded files are stored) *could* be a mount point that points to the bucket, knowing it would be slower than if filedir were on the local moodledata/ partition. No space issues ... but speed could be an issue (and that 'speed' also involves 'networking', so if the bucket server is 25 hops away from your server, probably not a good fit).
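If you do want to try the filedir-on-bucket layout, the mount can be made persistent across reboots with an /etc/fstab entry along these lines, assuming s3fs; the bucket name and paths are placeholders, and the sessions/cache folders stay on local disk because only filedir is mounted:

```shell
# /etc/fstab entry: mount the S3 bucket directly at moodledata/filedir.
# _netdev delays the mount until networking is up; allow_other lets the
# web-server user access it.
my-moodle-bucket /var/moodledata/filedir fuse.s3fs _netdev,allow_other,passwd_file=/root/.passwd-s3fs 0 0
```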
Don't think it would be too hard to test on a dev system, as buckets should be able to be mounted on any Linux system. One could install the 'Benchmark' plugin to see what the dev system looks like with all of moodledata local, then change moodledata/filedir/ to be the mount point for the bucket ... and run the benchmark again. It will be slower ... and Benchmark will warn (your score goes up and it tells you it's got problems) but does it really? Is the 'slowness' acceptable? 1.0 seconds vs 1.9 seconds ... something like that.
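Alongside the Benchmark plugin, a crude way to feel the difference is to time the same write against local disk and against the bucket mount. A rough sketch; the /mnt/s3 path is a placeholder for wherever your bucket is mounted:

```shell
#!/bin/sh
# Write a 16 MB test file to the given directory, report the elapsed
# seconds, then clean up. Run once for local disk and once for the
# bucket mount, and compare the two numbers.
write_test() {
    target="$1/bucket_speed_test.bin"
    start=$(date +%s)
    dd if=/dev/zero of="$target" bs=1M count=16 conv=fsync 2>/dev/null
    end=$(date +%s)
    echo "$1: $((end - start))s"
    rm -f "$target"
}

write_test /tmp          # local disk
# write_test /mnt/s3     # uncomment once the bucket is mounted
```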
My 2 cents!
Follow up ... my response wasn't quite on target ... didn't mention auto backups, but same idea ...
mount point: /autobackups - points to the Amazon Bucket.
Moodle will still use moodledata/temp/backup/ to build the course backups. The last thing the code does is 'copy' the .mbz file from moodledata/temp/backup/[somehashnameddirectory]/ to the destination ... in this example that would be /autobackups/ ... the mount point for the Amazon Bucket. Moodle then cleans up moodledata/temp/backup/X ... so there is a burst of space usage there for a little while. Obviously if you are really in a bind for space (drive 99% full) then you'll have to find space.
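Since that temp burst lands on the local disk, it can be worth checking headroom on the moodledata partition before the backup window. A small sketch; the path and the 90% threshold are arbitrary choices you'd adjust:

```shell
#!/bin/sh
# Warn if the partition holding moodledata is getting full, since
# automated backups are staged in moodledata/temp/backup/ before
# being copied to the bucket mount.
MOODLEDATA=/tmp            # replace with your real moodledata path
THRESHOLD=90               # percent-used level that triggers a warning

# Column 5 of `df -P` is the use percentage; strip the trailing '%'
usage=$(df -P "$MOODLEDATA" | awk 'NR==2 { sub(/%/, "", $5); print $5 }')
if [ "$usage" -gt "$THRESHOLD" ]; then
    echo "WARNING: $MOODLEDATA partition is ${usage}% full"
else
    echo "OK: $MOODLEDATA partition is ${usage}% full"
fi
```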
I do use my Google Bucket in that fashion ... also use it for site backups. Nice thing about that ... if one of my servers crashes and I have to spin up another, I can set up the GB mount on the new server and have both the auto course backups and the site backups to restore.