Question on dealing with Archives


Question on dealing with Archives

duplicity-talk mailing list
Hi everyone, 

I've got a question that I hope hasn't been asked a million times. I searched this mailing list's archives on the website, but my search terms were too generic and I didn't find what I'm looking for.

I'm using Duplicity to periodically upload unencrypted full backups to my Amazon S3 account. It works great, except I'd love to be able to download those archives and extract them myself. At the moment, when I download a duplicity-full.************.difftar.gz file, I get another archive (.difftar) that will not extract properly.

I've tried setting --s3-multipart-chunk-size and --volsize to 200 MB and 400 MB, but duplicity still splits the backup into multiple volumes.
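
For reference, the command I'm running is roughly like this (the bucket name and paths below are placeholders, and I'm quoting the options from memory):

    # unencrypted full backup of one small site straight to S3
    duplicity full --no-encryption \
        --s3-multipart-chunk-size 200 \
        --volsize 400 \
        /var/www/site1 s3+http://my-bucket/backups/site1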

I'm backing up small websites, and I'm looking for a fast, easy way to restore them in a worst-case scenario where the server blows up. I can get a new VPS with WordPress set up in seconds and then load the site files over FTP.

Is it possible to get a single gzipped archive for a full backup? Each backup would be less than 500 MB.

Thanks for any insight! 

Jeff
--

  



_______________________________________________
Duplicity-talk mailing list
[hidden email]
https://lists.nongnu.org/mailman/listinfo/duplicity-talk

Re: Question on dealing with Archives

duplicity-talk mailing list

Hello Jeffrey,

On 2017-06-07 16:46, Jeffrey via Duplicity-talk wrote:

> I'm using Duplicity to periodically upload unencrypted full backups to my Amazon S3 account. It works great, except I'd love to be able to download those archives and extract them myself. At the moment, when I download a duplicity-full.************.difftar.gz file, I get another archive (.difftar) that will not extract properly.
>
> [...]
>
> I'm backing up small websites, and I'm looking for a fast, easy way to restore them in a worst-case scenario where the server blows up. I can get a new VPS with WordPress set up in seconds and then load the site files over FTP.

This does not directly answer your question, but it sounds as though you are fighting duplicity a bit instead of using it.

Any reason you cannot back up each website to a different subfolder on Amazon and then use duplicity (essentially the inverse of your backup command) to extract the files if you need them? You should not really be extracting the archive files manually unless duplicity cannot do it for you. You can use duplicity to restore from the archive files even if you have moved them (say, off Amazon into a folder on your local machine).
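
For example, a restore is roughly your backup command with the source and target swapped (untested as written; the bucket name and paths are only placeholders):

    # restore the latest backup of one site into a local folder
    duplicity restore --no-encryption s3+http://my-bucket/backups/site1 /tmp/site1-restore

    # the same works against copies you have moved onto the local machine
    duplicity restore --no-encryption file:///home/you/downloaded-backups /tmp/site1-restore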

Alternatively, if you are just wanting a full backup that you can easily extract, is there a reason not to just do a compressed archive (e.g. a tar.gz)?

Kind regards,

Aaron


 




Re: Question on dealing with Archives

duplicity-talk mailing list
Thanks Aaron, 

I'm sure you are right; I'm trying to bend Duplicity to my needs.

"Alternatively, if you are just wanting a full backup that you can easily extract, is there a reason not to just do a compressed archive (e.g. a tar.gz)?" 

Is there a way to set Duplicity to create and upload a simple compressed archive, like a tar.gz, to S3? My apologies if I'm missing the obvious.

My goal is to keep things simple to deal with in the event of a problem. For example, I often travel, and I would love it if I could use any available device to log in to my S3 account and get usable files with whatever is available on the computer at hand. I'd need just a browser and FTP to be back up and running. It could be Windows, Mac, Chromebook, or even a smartphone.

Jeff


--

______________

Jeffrey Fongemie    




Re: Question on dealing with Archives

duplicity-talk mailing list

Hello Jeffrey,

On 07/06/17 18:16, Jeffrey wrote:
> Thanks Aaron,
>
> I'm sure you are right; I'm trying to bend Duplicity to my needs.

Aren't we all!

> "Alternatively, if you are just wanting a full backup that you can easily extract, is there a reason not to just do a compressed archive (e.g. a tar.gz)?"
>
> Is there a way to set Duplicity to create and upload a simple compressed archive, like a tar.gz, to S3? My apologies if I'm missing the obvious.

My point was that if you are not using any of the incremental features or the ability to restore at various points in time, etc., you do not need duplicity at all; you could just create a tar.gz as normal and upload it. Tar can even handle --exclude options.
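
Something along these lines would do it (the paths, bucket name, and use of the aws command-line client are illustrative assumptions, not a tested recipe):

    # create one compressed archive of a site, skipping logs and caches
    tar czf site1-backup.tar.gz --exclude='*.log' --exclude='cache' -C /var/www site1

    # upload it with whatever S3 client you prefer, e.g. the aws CLI
    aws s3 cp site1-backup.tar.gz s3://my-bucket/simple-backups/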

There is no way to make duplicity create plain tar.gz files -- all those extra files and diffs you do not want are how it does efficient incremental versions. Unless you are using duplicity for an additional reason that I am missing: why do you think you need duplicity?

> My goal is to keep things simple to deal with in the event of a problem. For example, I often travel, and I would love it if I could use any available device to log in to my S3 account and get usable files with whatever is available on the computer at hand. I'd need just a browser and FTP to be back up and running. It could be Windows, Mac, Chromebook, or even a smartphone.

As I say, if you only need one version and simplicity is key, I would just zip/tar.gz up the files and copy them onto Amazon or whatever.

Not trying to dissuade you from using duplicity, but something else may fit this use case better.

Kind regards,

Aaron


Re: Question on dealing with Archives

duplicity-talk mailing list
Aaron, 

My first message fell short of explaining everything I'm looking to do: both incremental and occasional full backups. Right now I've got Duplicity set (I think) to do daily incremental backups, which is great, plus a full backup every 14 days. Before Duplicity I was using a Ruby gem called Backup, which did full backups only; the backups were not incremental, just simple archives that could be extracted. But full backups all the time are too much, I think. Duplicity seems like a nice compromise.
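
Concretely, the daily job runs something like this (paths and bucket name are placeholders; I'm quoting it from memory):

    # incremental every day, but start a fresh full backup every 14 days
    duplicity --no-encryption --full-if-older-than 14D \
        /var/www/site1 s3+http://my-bucket/backups/site1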

> My point was that if you are not using any of the incremental features or the ability to restore at various points in time, etc., you do not need duplicity at all; you could just create a tar.gz as normal and upload it. Tar can even handle --exclude options.

My knowledge of server administration is weak, so without something like Duplicity I don't know how to make that happen.

I'll continue to experiment with this.  In the end I may just need to be more flexible or familiar with Duplicity.  

Jeff



--

______________

Jeffrey Fongemie    




Re: Question on dealing with Archives

duplicity-talk mailing list

Hello Jeffrey,

 

On 2017-06-07 23:45, Jeffrey wrote:

> Aaron,
>
> My first message fell short of explaining everything I'm looking to do: both incremental and occasional full backups. Right now I've got Duplicity set (I think) to do daily incremental backups, which is great, plus a full backup every 14 days. [...]
>
> My knowledge of server administration is weak, so without something like Duplicity I don't know how to make that happen.

Okay, so breaking this down, you essentially have two conceptual options:

1) Use a tool like duplicity to do backups and restores. If you back up each website to a different target folder (full or incremental), that should work well. If you need to restore, you should do it through duplicity and it should all work fine. You can check the manual, or ask here if you are struggling to make a restore work, but you really should test it before you need it. The problem with this for your use case is that if you want to restore onto a variety of platforms/devices while you are travelling, you likely will not have duplicity installed and will struggle to restore what you want -- even if we did offer a command to generate a tar.gz from a particular restore point, you would not have duplicity installed on your phone to create that tar.gz.

2) Use a standard tool to compress the files into a format that you can extract on all the devices you need to support -- essentially zipping them up, in Windows terminology -- and then put those files somewhere you can access them (say Amazon, or Dropbox/Google Cloud, etc.). Do not worry about being new to server admin; we all started somewhere. Somewhere like askubuntu is a good place to ask about these more generic tasks.

You are really going to struggle to do both with the same archives. If you go with (2), you cannot really do incremental backups. If you go with (1), it is much easier to use the same tool you used for the backup to do the restore. You could easily do both separately if you are happy to use twice the storage: use duplicity as your primary backup system, with all the incremental versions available, and regularly compress the files into plain archives you can open easily, for those emergency "access from a phone in a distant country" moments.
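
As a rough sketch of that "both separately" idea (paths and bucket names are placeholders, not a tested setup):

    # primary backup: duplicity with incrementals and a periodic full
    duplicity --no-encryption --full-if-older-than 14D /var/www/site1 s3+http://my-bucket/duplicity/site1

    # emergency copy: a plain tar.gz you can open from almost any device
    tar czf site1-latest.tar.gz -C /var/www site1
    aws s3 cp site1-latest.tar.gz s3://my-bucket/plain/site1-latest.tar.gz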

Kind regards,

Aaron

