New 'feature': distributed backups (with a little help from my friends)

Currently JGO relies solely on Linode (VPS disk & backup service) and my crappy PC for backups. According to Murphy’s law, this will one day prove insufficient. I feel like I’ve put all our eggs in one basket. Backups need to be distributed among at least three separate locations, so that we are prepared and able to recover JGO even if the Linode datacenter in London and my home town both get nuked accidentally.

My humble request to you, as a beacon of light in troubled times, is to occasionally (preferably periodically) fetch these encrypted files from this public directory:

There will eventually be 14 files in this directory, totalling about 1.4 GB, as two files (a homedir tar and a mysqldump) are created/overwritten for each day of the week. These files are encrypted, as they contain sensitive data, like PMs and your email addresses. In case of a slight disaster I will gratefully accept any backups and will decrypt the files myself, and/or make the passphrase available through a dead man’s switch - as I’d potentially be nuked myself too.
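Since the files are GPG-encrypted, recovery would boil down to a symmetric decrypt once the passphrase is released. Here’s a minimal round-trip sketch, assuming symmetric encryption (the filenames and the `hunter2` passphrase are made-up placeholders; `--pinentry-mode loopback` needs GnuPG 2.1+):

```shell
#!/bin/sh
# Round-trip demo of symmetric GPG encryption/decryption.
# Payload, filenames and passphrase are all illustrative.
echo "sensitive forum data" > backup.tar                # stand-in payload

# Encrypt: roughly what produces the *.gpg files in the public directory
gpg --batch --yes --pinentry-mode loopback --symmetric \
    --passphrase hunter2 --output backup.tar.gpg backup.tar

# Decrypt: the recovery step, once the passphrase is known
gpg --batch --yes --pinentry-mode loopback --decrypt \
    --passphrase hunter2 --output restored.tar backup.tar.gpg

cmp backup.tar restored.tar && echo "round trip OK"
```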

Do you know of any software that can pull this data for me once per day? I have a small server (a NAS) in my house that we use for media, and it has plenty of space to spare.

cronjob :point:

Derp.

1.4 GB isn’t really that much. Why not just set the files to sync automatically to a number of free, secure cloud providers (Dropbox, Mega, Google Drive, SpiderOak, etc.)? Then, once you are nuked, your dead man’s switch could send out instructions on how to recover the files.

Yeah, I don’t know about other services, but I use Microsoft’s OneDrive and was automatically given 15 GB for free. Seems like this would be a good solution.

Because if something goes wrong locally (say, the files are deleted), all the carefully synced copies are deleted too. In general, it’s better not to script/automate these kinds of things from one place, as that creates the risk of destroying all your data from that very same place in one big sweep.
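One cheap way around the deletion-propagation problem, sketched here with plain coreutils (all paths are made up): give every fetch its own dated snapshot directory, so a deletion in the synced source can never reach older snapshots.

```shell
#!/bin/sh
# Dated-snapshot sketch: each day's fetch lands in its own directory,
# so an upstream deletion never destroys earlier snapshots.
SRC=./live-mirror              # where the synced/fetched files land
DST=./snapshots/$(date +%F)    # one directory per day, e.g. snapshots/2015-03-01
mkdir -p "$SRC" "$DST"

echo "day one data" > "$SRC/backup.gpg"   # pretend today's fetch arrived
cp -a "$SRC/." "$DST/"                    # freeze it as today's snapshot

rm "$SRC/backup.gpg"           # simulate the dreaded upstream deletion
ls "$DST"                      # the snapshot still holds backup.gpg
```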

https://www.roiatalla.com/public/jgo-recovery/

Cronjob set up, running daily!

EDIT: Dropbox has saved my ass a lot with accidentally deleted files: it keeps file history for up to 3 months even on free accounts! I recommend you use that as well.

EDIT 2: I have finally received a medal from Riven… my life is complete!

This is my cronjob, by the way; how can I improve it?


wget -P /my/backup/folder/ -N -erobots=off -r -np -nd -A "*.gpg" http://www.java-gaming.org/recovery/

  • -P (directory prefix: the folder downloads are stored in)
  • -erobots=off (stops wget from downloading/honoring robots.txt)
  • -N (timestamping: only download files whose remote timestamp is newer, and overwrite old files instead of creating *.1, *.2 copies)
  • -r (recursive retrieval)
  • -np (never ascend into the parent directory)
  • -nd (don’t recreate the remote directory hierarchy locally)
  • -A (only accept/keep files matching this pattern)
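For reference, the wget line above drops straight into cron. A minimal crontab sketch for a daily fetch (the 04:00 run time and the paths are my arbitrary placeholders; add it with `crontab -e`):

```shell
# min hour day-of-month month day-of-week  command
# Fetch the encrypted backups daily at 04:00; -q keeps cron mail quiet.
0 4 * * * wget -q -P /my/backup/folder/ -N -erobots=off -r -np -nd -A "*.gpg" http://www.java-gaming.org/recovery/
```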

Prepare to be recycled.

I was looking for that stupid -nd option, the man pages are too long! Thanks.

Yeah, I can help out; I’ll do this manually once a week or so, not too much effort.

Converted the commands in previous post into a one-liner.

I also had set up backups for JGO. http://goharsha.com/jgo_recovery/

These backups are generated every day at 11:00 PM IST (Indian Standard Time).

Aha, I was looking for a way to set robots=off without using the wgetrc file and I didn’t know about -N, thanks!

Are the ‘13h’ files intentional by the way?