Re: using Google Cloud for virtual tapes
On 27.06.22 at 10:21, Stefan G. Weichinger wrote:
> On 03.06.22 at 09:13, Stefan G. Weichinger wrote:
>> I now at last received credentials for that GCS storage bucket, so I can start to try ... Does it make sense to somehow follow https://wiki.zmanda.com/index.php/How_To:Backup_to_Amazon_S3 ? I don't find anything mentioning Google Cloud in the wiki.
>
> *bump*

Still wondering. Now that amanda-3.5.3 is on GitHub, this might also bring updated code for the S3 stuff ... at least Chris told me back then that 3.5.2 would fix something related to my Google Cloud needs. I have a GC service-account key and two bucket names ... and I'd love to hook them up with Amanda straight away. Has anyone already done that?

PS: I built 3.5.3 deb packages today in a test VM.
Re: using Google Cloud for virtual tapes
On 27.06.22 at 10:21, Stefan G. Weichinger wrote:
> On 03.06.22 at 09:13, Stefan G. Weichinger wrote:
>> I now at last received credentials for that GCS storage bucket, so I can start to try ... Does it make sense to somehow follow https://wiki.zmanda.com/index.php/How_To:Backup_to_Amazon_S3 ? I don't find anything mentioning Google Cloud in the wiki.
>
> *bump*

I am considering setting up vtapes and rclone-ing them to GCP ... Does anyone do something similar? This would need some housekeeping: remove the rcloned vtapes after X runs, etc.
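The vtapes-plus-rclone idea above, housekeeping step included, could be sketched roughly as below. The local path, the `gcs:` remote name, and the 30-day retention are placeholder assumptions, not values from the thread; pruning by age approximates "remove after X runs" when runs are daily.

```python
# Sketch of the "rclone vtapes to GCP" plan, with housekeeping.
# VTAPE_DIR, REMOTE and KEEP_DAYS are hypothetical placeholders.
VTAPE_DIR = "/var/lib/amanda/vtapes"   # local vtape root
REMOTE = "gcs:amanda-vtapes"           # rclone remote pointing at the GCS bucket
KEEP_DAYS = 30                         # retention window for remote copies

def sync_cmd():
    """Push new/changed vtape slot files up to the bucket."""
    return ["rclone", "sync", VTAPE_DIR, REMOTE, "--transfers", "4"]

def prune_cmd():
    """Housekeeping: delete remote objects older than KEEP_DAYS."""
    return ["rclone", "delete", REMOTE, "--min-age", f"{KEEP_DAYS}d"]

# e.g. run nightly after amdump:
#   import subprocess
#   subprocess.run(sync_cmd(), check=True)
#   subprocess.run(prune_cmd(), check=True)
```

Note that `rclone sync` makes the remote match the local side, so a slot overwritten locally is overwritten remotely too; `rclone copy` would keep old remote copies until the `--min-age` prune catches them.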
Re: using Google Cloud for virtual tapes
On 03.06.22 at 09:13, Stefan G. Weichinger wrote:
> I now at last received credentials for that GCS storage bucket, so I can start to try ... Does it make sense to somehow follow https://wiki.zmanda.com/index.php/How_To:Backup_to_Amazon_S3 ? I don't find anything mentioning Google Cloud in the wiki.

*bump*
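For what it's worth, the S3 wiki recipe maps onto Google Cloud Storage along these lines. This is a sketch only: the bucket name, project and OAuth2 credentials are placeholders, and whether your build's S3 device accepts the `STORAGE_API "OAUTH2"` route should be checked against amanda-devices(7) for your version.

```
# amanda.conf sketch -- all values are placeholders
define device gcs_vtape {
    tapedev "s3:my-gcs-bucket/amanda-"
    device-property "S3_HOST"       "storage.googleapis.com"
    device-property "STORAGE_API"   "OAUTH2"
    device-property "CLIENT_ID"     "...apps.googleusercontent.com"
    device-property "CLIENT_SECRET" "..."
    device-property "REFRESH_TOKEN" "..."
    device-property "PROJECT_ID"    "my-gcp-project"
}
```

Alternatively, GCS's S3 interoperability mode may work with the plain `S3_ACCESS_KEY`/`S3_SECRET_KEY` properties from the wiki How-To, with only `S3_HOST` changed.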
Re: using Google Cloud for virtual tapes
On 05.04.22 at 11:55, Stefan G. Weichinger wrote:
> On 04.04.22 at 23:33, Chris Hassell wrote:
>> Google Cloud works, and works well enough indeed. [...the A/B/C/D rundown on block sizes and the 5TB cap snipped...]
>
> @Chris, thanks for this info. Sounds a bit complicated; I will have to see how I can start there. I won't have very large DLEs anyway, which might help. I have ~700GB per tape right now, which is not very much. Although bandwidth (= the backup time window) will also be an issue here.

I now at last received credentials for that GCS storage bucket, so I can start to try ... Does it make sense to somehow follow https://wiki.zmanda.com/index.php/How_To:Backup_to_Amazon_S3 ? I don't find anything mentioning Google Cloud in the wiki.
Re: using Google Cloud for virtual tapes
On 04.04.22 at 23:33, Chris Hassell wrote:
> Google Cloud works, and works well enough indeed. All of them work ... but a non-block technique is capped at 5TB by almost all providers, and getting there is complicated and can DOUBLE your storage cost over one month [only] if you cannot make temporaries without being charged (looking at you, Wasabi!!). For Google specifically, it is one of A, B, C or D:
>
> A) the desired block size must be kept automatically small (it varies, but ~40MB or a smaller buffer for a 4GB system) ... and each DLE "tape" must be limited in size
> B) the biggest block size can be used to store 5TB objects [max == 512MB], but the curl buffers will take ~2.5GB and must [currently] be hardcoded in the build. It's too much for many systems.
> C) the biggest block size can be used, but Google cannot FAIL EVEN ONCE ... or the cloud upload cannot be restarted and the DLE basically fails. This doesn't often succeed on the way to 5TB.
> D) the biggest block size can be used, but Multi-Part must be turned off, and a second and later DLE gets very, very slow to add
>
> Option D has NO limits on backups, but it is what needs the O(log N) check for single-stored blocks. It currently does an O(N) check against earlier blocks to verify the cloud storage total after file #1, every time. Verrry slow at only 1000 per transaction.

@Chris, thanks for this info. Sounds a bit complicated; I will have to see how I can start there. I won't have very large DLEs anyway, which might help. I have ~700GB per tape right now, which is not very much. Although bandwidth (= the backup time window) will also be an issue here.
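The ~5TB figure in option B falls out of the multipart limits Chris mentions: a quick check, assuming the standard S3-style cap of 10,000 parts per multipart upload (that cap is not stated in the thread):

```python
# Why the ~5TB object cap: S3-style multipart uploads allow at most
# 10,000 parts, and the S3 device's block size tops out at 512 MiB.
MAX_PARTS = 10_000
MAX_BLOCK = 512 * 2**20              # 512 MiB in bytes

max_object = MAX_PARTS * MAX_BLOCK
print(max_object)                    # 5368709120000 bytes
print(round(max_object / 2**40, 2))  # 4.88 TiB, just under the 5 TiB provider limit
```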
RE: using Google Cloud for virtual tapes
Google Cloud works, and works well enough indeed. All of them work ... but a non-block technique is capped at 5TB by almost all providers, and getting there is complicated and can DOUBLE your storage cost over one month [only] if you cannot make temporaries without being charged (looking at you, Wasabi!!). For Google specifically, it is one of A, B, C or D:

A) the desired block size must be kept automatically small (it varies, but ~40MB or a smaller buffer for a 4GB system) ... and each DLE "tape" must be limited in size
B) the biggest block size can be used to store 5TB objects [max == 512MB], but the curl buffers will take ~2.5GB and must [currently] be hardcoded in the build. It's too much for many systems.
C) the biggest block size can be used, but Google cannot FAIL EVEN ONCE ... or the cloud upload cannot be restarted and the DLE basically fails. This doesn't often succeed on the way to 5TB.
D) the biggest block size can be used, but Multi-Part must be turned off, and a second and later DLE gets very, very slow to add

Option D has NO limits on backups, but it is what needs the O(log N) check for single-stored blocks. It currently does an O(N) check against earlier blocks to verify the cloud storage total after file #1, every time. Verrry slow at only 1000 per transaction.

> -----Original Message-----
> From: Stefan G. Weichinger
> Sent: Monday, April 4, 2022 1:40 AM
> To: Chris Hassell; AMANDA users <us...@amanda.org>
> Subject: Re: using Google Cloud for virtual tapes
>
> On 29.03.22 at 15:34, Chris Hassell wrote:
>> Google Cloud is somewhat difficult because they don't fully support the Amazon S3 operations. One cannot upload blocks and "CopyPart" them into a larger object. Wasabi and S3 and others can do that.
>>
>> There needs to be a simple overhaul of the "millions of blocks" upload technique (non-multipart backups) so that it can be done without O(n) checks for every DLE.
>
> So is it usable? Or do I have to do some combo of local vtapes and amvaulting them into the cloud, maybe?

Confidentiality Notice | The information transmitted by this email is intended only for the person or entity to which it is addressed. This email may contain proprietary, business-confidential and/or privileged material. If you are not the intended recipient of this message, be aware that any use, review, re-transmission, distribution, reproduction or any action taken in reliance upon this message is strictly prohibited. If you received this in error, please contact the sender and delete the material from all computers.
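The O(N)-vs-O(log N) point about option D can be illustrated with a toy model: to find how many blocks of a tape file already sit in the bucket, you can either walk the keys one by one (what the current code effectively does, at 1000 keys per transaction) or probe block existence with an exponential-then-binary search. This is an illustration of the idea only, not Amanda's actual code; `exists` stands in for a HEAD request against the bucket.

```python
# Toy model: blocks 0..n-1 of a tape file are stored as separate objects.
def count_blocks_linear(exists):
    """O(N): check keys one by one until the first missing block."""
    n = 0
    while exists(n):
        n += 1
    return n

def count_blocks_logarithmic(exists):
    """O(log N): exponential probe for an upper bound, then binary search."""
    if not exists(0):
        return 0
    hi = 1
    while exists(hi):          # probe 1, 2, 4, 8, ... until past the end
        hi *= 2
    lo = hi // 2               # last block known to exist
    while lo + 1 < hi:         # binary search for the first missing block
        mid = (lo + hi) // 2
        if exists(mid):
            lo = mid
        else:
            hi = mid
    return lo + 1

stored = 1_000_000             # pretend a million blocks are already uploaded
probe = lambda i: i < stored
print(count_blocks_logarithmic(probe))  # 1000000, in ~40 probes instead of 1000001
```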
Re: using Google Cloud for virtual tapes
On 29.03.22 at 15:34, Chris Hassell wrote:
> Google Cloud is somewhat difficult because they don't fully support the Amazon S3 operations. One cannot upload blocks and "CopyPart" them into a larger object. Wasabi and S3 and others can do that.
>
> There needs to be a simple overhaul of the "millions of blocks" upload technique (non-multipart backups) so that it can be done without O(n) checks for every DLE.

So is it usable? Or do I have to do some combo of local vtapes and amvaulting them into the cloud, maybe?
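The amvault combo mentioned above would mean dumping to local vtapes first, then copying finished runs to a cloud changer, along these lines. The label template, changer name and config name are placeholders; check amvault(8) for the exact options your version supports:

```
amvault --fulls-only --label-template "GCS-%%%" \
        --dst-changer cloud_gcs_changer daily
```

That keeps the fast local vtapes as the primary copy, so a flaky cloud upload only delays the vault copy, not the backup run itself.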
RE: using Google Cloud for virtual tapes
Google Cloud is somewhat difficult because they don't fully support the Amazon S3 operations. One cannot upload blocks and "CopyPart" them into a larger object. Wasabi and S3 and others can do that.

There needs to be a simple overhaul of the "millions of blocks" upload technique (non-multipart backups) so that it can be done without O(n) checks for every DLE.

> -----Original Message-----
> From: owner-amanda-us...@amanda.org On Behalf Of Stefan G. Weichinger
> Sent: Friday, March 25, 2022 3:59 AM
> To: AMANDA users
> Subject: using Google Cloud for virtual tapes
>
> At a customer I have to somehow move the backups into Google Cloud (the company moves everything there).
>
> Does anyone already combine that with Amanda somehow?
>
> They forwarded me this:
>
> https://www.cloudbooklet.com/gsutil-cp-copy-and-move-files-on-google-cloud/
>
> I could think of using amvault to *copy* vtape content there.
>
> And I find "Cloud Storage FUSE", which allows mounting such a storage bucket.
>
> Does anyone here have experience with this and Amanda backups?
>
> thanks, regards, Stefan