
Life after CrashPlan: why they sucked, why Backblaze & B2 suck, Glacier vs Others


Comments

  • @WSS said:
    Your data won't really matter, since the planet will be gone before a restoration could complete, anyhow.

    I hope my data lives on so that some mutant species will enjoy my documentaries.

  • @boxelder said:

    @WSS said:
    Your data won't really matter, since the planet will be gone before a restoration could complete, anyhow.

    I hope my data lives on so that some mutant species will enjoy my documentaries.

    You don't have to wait for the end of the world to post them on YouTube.

  • rm_ IPv6 Advocate, Veteran

    boxelder said: I use a private torrent tracker for getting these documentaries. If you build up a good ratio at a tracker, that's basically free re-download credit in the event of a local drive failure. You just need to cron an ls > movies.txt to keep a text file of all the titles you've downloaded, and print that out if you need a backup.

    That's a good idea: the only thing worse than losing data is losing data and not even remembering what you had there. Just a suggestion: replace ls with find, which works recursively.
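
    Something like this crontab entry would do it (a minimal sketch; /srv/media and the output path are placeholders, not from the thread):

    # Nightly at 03:00: recursively record every file under the media
    # directory into a plain-text catalog you can print or copy elsewhere.
    0 3 * * * find /srv/media -type f > /home/user/movies.txt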

  • snickerpop Member
    edited November 2017

    @peixotorms said:
    @raindog308 I think you may want to plan your directory structure a bit more.
    For example, one archive where you put stuff that doesn't change often (and keep it in sync with rclone), and a separate delta archive for changing or recent stuff.

    • Stick to a standard naming scheme, such as YYYYMMDD/directoryname/YYYYMMDD_backup.zip

    • Don't rename the archive directory or its contents; just add and delete stuff daily in the delta archive.

    • Have a different bucket for different kinds of data... don't put everything inside the same one, and rclone specific, smaller directories (it starts faster when there are fewer files).

    • Set a retention limit on your bucket, so you don't keep paying for TBs of data accumulated from "2 years ago".

    • Use at least 16 or 32 threads.

    • Use copy, instead of sync, when possible... this is what I use:

    /usr/bin/rclone copy -v --stats=15s --transfers 32 /home/backups/recovery myremotename:bucketnamerecovery

    /usr/bin/rclone sync -v --stats=15s --transfers 64 /home/backups/delta myremotename:bucketnamedelta

    I would use the log-file option too. If you're syncing a bunch of files, a log is much easier to work with than scrolling terminal output:
    --log-file=FILE
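
    For example, the two commands above with logging bolted on (the log paths are my own placeholders, not from the post):

    # Archive: copy never deletes on the remote, so it's safe for the static set.
    /usr/bin/rclone copy -v --stats=15s --transfers 32 --log-file=/var/log/rclone-recovery.log /home/backups/recovery myremotename:bucketnamerecovery

    # Delta: sync mirrors local deletions to the remote, so keep it scoped to the changing directory.
    /usr/bin/rclone sync -v --stats=15s --transfers 64 --log-file=/var/log/rclone-delta.log /home/backups/delta myremotename:bucketnamedelta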

  • Y'all are a bunch of hoarders.

    My kind of people.

  • I learned so much in this thread. Cheers!
