Critical Data Backup

I use a Synology NAS locally to back up my critical data: RAID 1 across 4 hard drives, 3TB each (2 Western Digital, 2 Seagate). I've run it for 5 years, had one hard drive go bad, and rebuilt the array successfully. I also bought storage from wasabi.com (2 years ago) and Servarica (half a month ago) for backup redundancy, but have never had to restore from them. My data volume is about 450GB and grows by roughly 20GB per year. I've been assured this setup is foolproof; should I still keep a cold-storage backup as well? For this data, cost is not a concern.


Comments

  • Daniel15 Veteran
    edited December 2021

    Some people follow the 3-2-1 strategy for critical data: You should have at least three copies of the data: One main copy and two backups. Two should be onsite on different mediums (eg. portable HDDs, Blu-Rays, whatever), and one should be off-site. The off-site backup could be "in the cloud", on portable HDDs in a safe deposit box at a bank, whatever, as long as it's in a different location to the primary backups.

    The reason for one of the backup copies being onsite is that it's usually faster and cheaper to restore from.

    IMO it's also fine to have one copy locally and two copies in the cloud at different providers (eg. one at Servarica, and one at HostHatch, Backblaze, Wasabi, etc) as long as one of them has cheap restores (ie. if it's a VPS, the monthly bandwidth is sufficient in case you need to download the entire backup) and your internet is fast enough to make that feasible (you won't be waiting days to download the entire backup)
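
    As a rough worked example (using the OP's numbers, which I'm assuming here): a full restore of ~450GB over a 100 Mbit/s line takes about 450 × 8 / 100 ≈ 36,000 seconds, i.e. roughly 10 hours; on a 20 Mbit/s line that stretches to around two days. It's also worth checking that the provider's bandwidth allowance or egress pricing actually permits a full-size download before relying on it.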

  • deank Member, Troll
    edited December 2021

    Backup...?
    On LET?

    Some people here can't even be arsed to back up 1 MB of critical data; the very concept of backing up seems totally alien to them.

  • @Daniel15 said:
    Some people follow the 3-2-1 strategy for critical data: You should have at least three copies of the data: One main copy and two backups. Two should be onsite on different mediums (eg. portable HDDs, Blu-Rays, whatever), and one should be off-site. The off-site backup could be "in the cloud", on portable HDDs in a safe deposit box at a bank, whatever, as long as it's in a different location to the primary backups.

    The reason for one of the backup copies being onsite is that it's usually faster and cheaper to restore from.

    IMO it's also fine to have one copy locally and two copies in the cloud at different providers (eg. one at Servarica, and one at HostHatch, Backblaze, Wasabi, etc) as long as one of them has cheap restores (ie. if it's a VPS, the monthly bandwidth is sufficient in case you need to download the entire backup) and your internet is fast enough to make that feasible (you won't be waiting days to download the entire backup)

    I use 4 hard disks in RAID 1, so nothing is lost unless all 4 disks fail; even with 3 damaged there's no problem. Do the member disks of the RAID 1 array count as one of the backup copies?

  • Daniel15 Veteran
    edited December 2021

    @mcgree said: I use 4 hard disks in RAID 1, so nothing is lost unless all 4 disks fail; even with 3 damaged there's no problem. Do the member disks of the RAID 1 array count as one of the backup copies?

    https://www.raidisnotabackup.com/

    RAID isn't going to save you if your NAS gets destroyed by a fire or tornado or something like that :smile:

    Data corruption or accidental deletion is also an issue where you'll need backups. RAID will just mirror the corrupted data; you'll need some sort of backup to restore the good data before it was corrupted or deleted. That's where an onsite backup is useful.

  • yoursunny Member, IPv6 Advocate

    @Daniel15 said:
    Data corruption or accidental deletion is also an issue where you'll need backups. RAID will just mirror the corrupted data; you'll need some sort of backup to restore the good data before it was corrupted or deleted. That's where an onsite backup is useful.

    I have daily rclone sync to HostHatch 250GB.
    If data is corrupted, rclone would faithfully synchronize the corrupted files to the backup.
    Oops.
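
    A possible mitigation (sketch only, untested here; the remote name "hosthatch" and paths are placeholders) is to have rclone move overwritten or deleted files into a dated archive directory instead of discarding them:

      # keep the previous versions of changed/deleted files under a per-day directory
      rclone sync /data hosthatch:backup \
        --backup-dir hosthatch:backup-archive/$(date +%F)

    Older versions then pile up under backup-archive/, so a sync of corrupted files can still be undone by pulling back the prior copy.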

    I manually copy files to USB HDD.
    This is done maybe 4~8 times per year.
    Nevertheless, there's still no guarantee that what I copied isn't corrupted.

    The data is family photos.
    In the worst case, I can get them back from Facebook Albums, but it's lower image quality.

    As for the push-up videos, I don't worry much, because I can always redo the push-ups.
    Family photos are different because grandma is dead so I cannot retake the photos.

    Thanked by 1: farsighter
  • Daniel15 Veteran
    edited December 2021

    @yoursunny said: If data is corrupted, rclone would faithfully synchronize the corrupted files to the backup.
    Oops.

    On Linux systems I use Borgbackup since it lets me keep months of backups relatively cheaply (as it dedupes blocks) and I can go back to an uncorrupted file. I still haven't quite figured out what to use for backups on my Windows systems. My approach right now is to throw all the important files into Seafile, which is hosted on a HostHatch VPS, which is then backed up to a Servarica VPS.
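
    For reference, a minimal Borg setup along those lines looks roughly like this (the repo location, paths and retention are placeholders, not my exact config):

      # one-time: create an encrypted, deduplicating repository on the backup box
      borg init --encryption=repokey-blake2 backup@storage-vps:/srv/borg/laptop

      # nightly: archive the important paths; dedup plus zstd keeps increments small
      borg create --stats --compression zstd \
          backup@storage-vps:/srv/borg/laptop::'{hostname}-{now:%Y-%m-%d}' \
          /home /etc

      # keep months of history without the repo growing much
      borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
          backup@storage-vps:/srv/borg/laptop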

  • @deank said:
    Backup...?
    On LET?

    Some people here can't even be arsed to back up 1 MB of critical data; the very concept of backing up seems totally alien to them.

    LOL I haven't seen a "My data got [corrupted | lost | eaten by a grue] and the provider of my $10/year VPS didn't have backups of it!! I'm losing MILLIONS! I'm going to sue them" post in a while.

  • @Daniel15 said:
    Some people follow the 3-2-1 strategy for critical data: You should have at least three copies of the data: One main copy and two backups. Two should be onsite on different mediums (eg. portable HDDs, Blu-Rays, whatever), and one should be off-site. The off-site backup could be "in the cloud", on portable HDDs in a safe deposit box at a bank, whatever, as long as it's in a different location to the primary backups.

    The reason for one of the backup copies being onsite is that it's usually faster and cheaper to restore from.

    IMO it's also fine to have one copy locally and two copies in the cloud at different providers (eg. one at Servarica, and one at HostHatch, Backblaze, Wasabi, etc) as long as one of them has cheap restores (ie. if it's a VPS, the monthly bandwidth is sufficient in case you need to download the entire backup) and your internet is fast enough to make that feasible (you won't be waiting days to download the entire backup)

    There really should be a Backup 3-2-1 sticky post for LET. Would probably get more sales for advertisers.

  • @mcgree said:

    @Daniel15 said:
    Some people follow the 3-2-1 strategy for critical data: You should have at least three copies of the data: One main copy and two backups. Two should be onsite on different mediums (eg. portable HDDs, Blu-Rays, whatever), and one should be off-site. The off-site backup could be "in the cloud", on portable HDDs in a safe deposit box at a bank, whatever, as long as it's in a different location to the primary backups.

    The reason for one of the backup copies being onsite is that it's usually faster and cheaper to restore from.

    IMO it's also fine to have one copy locally and two copies in the cloud at different providers (eg. one at Servarica, and one at HostHatch, Backblaze, Wasabi, etc) as long as one of them has cheap restores (ie. if it's a VPS, the monthly bandwidth is sufficient in case you need to download the entire backup) and your internet is fast enough to make that feasible (you won't be waiting days to download the entire backup)

    I use 4 hard disks in RAID 1, so nothing is lost unless all 4 disks fail; even with 3 damaged there's no problem. Do the member disks of the RAID 1 array count as one of the backup copies?

    You're more likely to lose data to accidental user error or the NAS main board dying. I'm not sure how hard it is to move the drives to a current/similar NAS and have it import the array successfully.

    For 500GB, add cloud backup for additional redundancy.

  • @Daniel15 said:

    @mcgree said: I use 4 hard disks in RAID 1, so nothing is lost unless all 4 disks fail; even with 3 damaged there's no problem. Do the member disks of the RAID 1 array count as one of the backup copies?

    https://www.raidisnotabackup.com/

    RAID isn't going to save you if your NAS gets destroyed by a fire or tornado or something like that :smile:

    By that argument, there's no such thing as backups. >:)

    Data corruption or accidental deletion is also an issue where you'll need backups. RAID will just mirror the corrupted data; you'll need some sort of backup to restore the good data before it was corrupted or deleted. That's where an onsite backup is useful.

    That's more likely.

    Nowadays, with malware cocksuckers running wild, one should strive for a Veeam hardened repository.

  • TimboJones Member
    edited December 2021

    @yoursunny said:
    Family photos are different because grandma is dead so I cannot retake the photos.

    Not impossible, just a challenge. But she ain't smiling. :#

    Thanked by 1: dosai
  • yoursunny Member, IPv6 Advocate

    @TimboJones said:

    @yoursunny said:
    Family photos are different because grandma is dead so I cannot retake the photos.

    Not impossible, just a challenge. :#

    Resurrect my grandma for 3 days, and I'll give you my VirMach $8.88.

    Thanked by 1: Erisa
  • @Daniel15 said:

    @yoursunny said: If data is corrupted, rclone would faithfully synchronize the corrupted files to the backup.
    Oops.

    On Linux systems I use Borgbackup since it lets me keep months of backups relatively cheaply (as it dedupes blocks) and I can go back to an uncorrupted file. I still haven't quite figured out what to use for backups on my Windows systems. My approach right now is to throw all the important files into Seafile, which is hosted on a HostHatch VPS, which is then backed up to a Servarica VPS.

    Veeam agent. Free. Incremental updates. Restore to different hardware, etc.

  • @yoursunny said:

    @TimboJones said:

    @yoursunny said:
    Family photos are different because grandma is dead so I cannot retake the photos.

    Not impossible, just a challenge. :#

    Resurrect my grandma for 3 days, and I'll give you my VirMach $8.88.

    A shovel and location needed. Unless she was cremated, then that's a job for @Jesus. Luckily, he's a member here.

  • @TimboJones said:

    @Daniel15 said:

    @mcgree said: I use 4 hard disks in RAID 1, so nothing is lost unless all 4 disks fail; even with 3 damaged there's no problem. Do the member disks of the RAID 1 array count as one of the backup copies?

    https://www.raidisnotabackup.com/

    RAID isn't going to save you if your NAS gets destroyed by a fire or tornado or something like that :smile:

    By that argument, there's no such thing as backups. >:)

    Data corruption or accidental deletion is also an issue where you'll need backups. RAID will just mirror the corrupted data; you'll need some sort of backup to restore the good data before it was corrupted or deleted. That's where an onsite backup is useful.

    That's more likely.

    Nowadays, with malware cocksuckers running wild, one should strive for a Veeam hardened repository.

    Borgbackup supports an "append-only" mode which disallows clients from deleting data from the backup repo. It means that even if an attacker gets the SSH key and tries to delete all the data from the backup, it's just marked as deleted, but isn't actually deleted from the repo. Pretty easy to recover the backups.
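
    Roughly, the server-side enforcement looks like this (repo path and key are placeholders): in ~/.ssh/authorized_keys on the backup host, pin the client's key to a forced "borg serve" command so that even a stolen key can only append:

      command="borg serve --append-only --restrict-to-path /srv/borg/laptop",restrict ssh-ed25519 AAAA...your-client-key... borg-client

    Anything the client "deletes" is then only recorded in the repo's transaction log and can still be recovered from the server side.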

  • @TimboJones said:

    @Daniel15 said:

    @yoursunny said: If data is corrupted, rclone would faithfully synchronize the corrupted files to the backup.
    Oops.

    On Linux systems I use Borgbackup since it lets me keep months of backups relatively cheaply (as it dedupes blocks) and I can go back to an uncorrupted file. I still haven't quite figured out what to use for backups on my Windows systems. My approach right now is to throw all the important files into Seafile, which is hosted on a HostHatch VPS, which is then backed up to a Servarica VPS.

    Veeam agent. Free. Incremental updates. Restore to different hardware, etc.

    I'd prefer sticking with an open source app :)

    Thanked by 2: yoursunny, drunkendog
  • I am using two Synology NAS units and a HostHatch 2TB VPS to store my data. One NAS is at my home with RAID 5; the other is at my friend's home in a different city. I think this plan is very reliable.

  • TimboJones Member
    edited December 2021

    @Daniel15 said:

    @TimboJones said:

    @Daniel15 said:

    @yoursunny said: If data is corrupted, rclone would faithfully synchronize the corrupted files to the backup.
    Oops.

    On Linux systems I use Borgbackup since it lets me keep months of backups relatively cheaply (as it dedupes blocks) and I can go back to an uncorrupted file. I still haven't quite figured out what to use for backups on my Windows systems. My approach right now is to throw all the important files into Seafile, which is hosted on a HostHatch VPS, which is then backed up to a Servarica VPS.

    Veeam agent. Free. Incremental updates. Restore to different hardware, etc.

    I'd prefer sticking with an open source app :)

    Like you're going to review the code and modify it? Is the concern they'll disappear or they'll do something malicious?

    They'll do way more QA than any open source project. I've given up on several open source projects due to bugs and design decisions. Many miss basic features like copying/restoring to a smaller drive/partition.

    So maybe UrBackup for your Windows machines? I've never used it myself.

    Edit: I prefer image backups over data-only backups, especially for Windows.

  • @tykgood said:
    I am using two Synology NAS units and a HostHatch 2TB VPS to store my data. One NAS is at my home with RAID 5; the other is at my friend's home in a different city. I think this plan is very reliable.

    As long as malware can't fuck up all three locations, you're good.

  • yoursunny Member, IPv6 Advocate

    @TimboJones said:

    @yoursunny said:

    @TimboJones said:

    @yoursunny said:
    Family photos are different because grandma is dead so I cannot retake the photos.

    Not impossible, just a challenge. :#

    Resurrect my grandma for 3 days, and I'll give you my VirMach $8.88.

    A shovel and location needed. Unless she was cremated, then that's a job for @Jesus. Luckily, he's a member here.

    One grandma was cremated.
    The other grandma was donated to medical research, and there's only an engraved name.
    Which one can you bring back?

  • @TimboJones said: Like you're going to review the code and modify it?

    Yeah I sometimes add extra features if I need them :)

    I used to use Duplicati on a Windows server and it seemed to work well, albeit not as well as Borg.

    It doesn't have to be free of charge... Happy to pay for something that works well. There's a common misconception that open source software has to be free of charge.

  • @yoursunny said:

    @TimboJones said:

    @yoursunny said:

    @TimboJones said:

    @yoursunny said:
    Family photos are different because grandma is dead so I cannot retake the photos.

    Not impossible, just a challenge. :#

    Resurrect my grandma for 3 days, and I'll give you my VirMach $8.88.

    A shovel and location needed. Unless she was cremated, then that's a job for @Jesus. Luckily, he's a member here.

    One grandma was cremated.
    The other grandma was donated to medical research, and there's only an engraved name.
    Which one can you bring back?

    You should probably develop the photos and keep three copies in separate locations. I think developing photos and physical photo albums are still a thing.

  • @Daniel15 said:

    @TimboJones said: Like you're going to review the code and modify it?

    Yeah I sometimes add extra features if I need them :)

    Or use a solution that already includes those features. Any specific ones, or do you just want the option in case the need arises?

    I used to use Duplicati on a Windows server and it seemed to work well, albeit not as well as Borg.

    It doesn't have to be free of charge... Happy to pay for something that works well. There's a common misconception that open source software has to be free of charge.

    Oh, I know you'd pay. I'm just not aware of anything that isn't data-only backup and can also do image backups while the system is running. If you just need data backups, then there are probably lots of options that can do compression and incremental versioning. Built-in ReFS might even be something to look into.

  • @TimboJones said:

    @Daniel15 said:

    @yoursunny said: If data is corrupted, rclone would faithfully synchronize the corrupted files to the backup.
    Oops.

    On Linux systems I use Borgbackup since it lets me keep months of backups relatively cheaply (as it dedupes blocks) and I can go back to an uncorrupted file. I still haven't quite figured out what to use for backups on my Windows systems. My approach right now is to throw all the important files into Seafile, which is hosted on a HostHatch VPS, which is then backed up to a Servarica VPS.

    Veeam agent. Free. Incremental updates. Restore to different hardware, etc.

    I like Veeam Agent because it's easy to use (disclosure: I've never tried to restore a backup from it), but the biggest downside I run into is that you can only run one job with the free version of Veeam Agent. I think the free version of Veeam Backup & Replication lets you have up to 10 jobs, but it appears to use more resources and is more focused on backing up VMs. I'm only backing up physical computers, not VMs.

    Since my files aren't very well organized and Agent only lets me set up one backup job, I have to do a complete drive backup. This requires much more space than would be needed if I could exclude certain folders (e.g. those containing media files, temporary files, etc.), which also limits the number of copies of previous backups I can keep.

  • @user123 said:

    I like Veeam Agent because it's easy to use (disclosure: I've never tried to restore a backup from it), but the biggest downside I run into is that you can only run one job with the free version of Veeam Agent. I think the free version of Veeam Backup & Replication lets you have up to 10 jobs, but it appears to use more resources and is more focused on backing up VMs. I'm only backing up physical computers, not VMs.

    Veeam Backup & Replication can back up both physical as well as virtual systems. The community (free) version is also not limited to 10 jobs, but to 10 hosts, which I think is pretty generous. You can set up as many backup jobs as you want -- as well as backup copy jobs, which serve to replicate your backups in several locations.

    Veeam does require a Windows system to run the VBR server component and its GUI, which I see as a weakness. But you can use a VPS for that if you like. The VBR then allows you to deploy the agent software on your target systems, and it supports many Linux flavors as well as Macs and, of course, Windows. The thing I really like is that you can set up repositories (targets for the backups) in a variety of different ways using your own storage resources, whether it's local or a remote VPS. The backup sources then communicate directly with the repositories, all directed by the VBR component.

    So, Veeam is certainly worth looking into, even if they do specialize more in protecting virtualized environments (VMWare and Hyper-V).

    Thanked by 1: user123
  • @Daniel15 said:
    I'd prefer sticking with an open source app :)

    I understand your reasoning here, but I lean towards a hybrid approach. I use a mix of open source and closed source commercial software. I figure it's unlikely that an issue with one would affect the other, so I've got a better chance of having good backups if I'm using different technologies to perform them.

    And of course the commercial software offers some level of support, which can be nice to have when dealing with business data :smile:

  • @aj_potc said:

    @user123 said:

    I like Veeam Agent because it's easy to use (disclosure: I've never tried to restore a backup from it), but the biggest downside I run into is that you can only run one job with the free version of Veeam Agent. I think the free version of Veeam Backup & Replication lets you have up to 10 jobs, but it appears to use more resources and is more focused on backing up VMs. I'm only backing up physical computers, not VMs.

    Veeam Backup & Replication can back up both physical as well as virtual systems. The community (free) version is also not limited to 10 jobs, but to 10 hosts, which I think is pretty generous. You can set up as many backup jobs as you want -- as well as backup copy jobs, which serve to replicate your backups in several locations.

    Veeam does require a Windows system to run the VBR server component and its GUI, which I see as a weakness. But you can use a VPS for that if you like. The VBR then allows you to deploy the agent software on your target systems, and it supports many Linux flavors as well as Macs and, of course, Windows. The thing I really like is that you can set up repositories (targets for the backups) in a variety of different ways using your own storage resources, whether it's local or a remote VPS. The backup sources then communicate directly with the repositories, all directed by the VBR component.

    So, Veeam is certainly worth looking into, even if they do specialize more in protecting virtualized environments (VMWare and Hyper-V).

    Thanks for clarifying the B&R limitation. How do you back up the VBR server (configuration) and can you have multiple systems running the VBR server as a failover in case the primary one goes offline or dies?

    Going back to my point about Veeam Agent's limitations, it seems like a very strange decision to allow individual workstations to support multiple backup jobs ONLY when configured to do so by an external (VBR) server and only one backup job if configured locally without an external server.

  • @user123 said:
    Thanks for clarifying the B&R limitation. How do you back up the VBR server (configuration) and can you have multiple systems running the VBR server as a failover in case the primary one goes offline or dies?

    The VBR configuration can be backed up to one (and just one) of the repositories you've set up. But the Veeam backup files themselves don't require the VBR config in order to restore them. They contain all the info they need internally, so if you lose the VBR server, it's not a huge deal. You can reinstall it anywhere, point it to your existing repositories, and it'll scan them and pick up your backups.

    Going back to my point about Veeam Agent's limitations, it seems like a very strange decision to allow individual workstations to support multiple backup jobs ONLY when configured to do so by an external (VBR) server and only one backup job if configured locally without an external server.

    I'm not sure about Veeam's reasoning behind that. Perhaps the agent alone doesn't have enough logic to handle all of the situations that might arise by supporting multiple (and potentially concurrent) jobs.

    The VBR server is a command and control component, so it manages all of the jobs and repositories, logging, and notifications in case something goes wrong. Although it's a bit heavy size-wise, it's good to have it. It also unlocks a good number of other features that you don't get with the agent alone.

    Thanked by 1: user123
  • @user123 said:
    ... can you have multiple systems running the VBR server as a failover in case the primary one goes offline or dies?

    Forgot to answer this. No, you're not supposed to have multiple VBR servers talking to the same agents. It may work, but it's definitely not supported.

    If the VBR server itself fails, you can reinstall it and recover the configuration. Or, like I mentioned, just point it at the backup files, which contain everything needed to start recovering data (assuming that's your priority).

    Oh, and if I didn't make it clear, the VBR server doesn't have to be a backup repository itself. It can serve that role, but I wouldn't recommend it.

    Thanked by 1: user123
  • Shot2 Member
    edited December 2021

    For home use, about 500GB of infrequently updated critical data: one on-site backup (scheduled, block-based checksummed disk replication: a kind of asynchronous RAID if you prefer). Convenient for quickly resuming work if the main disk dies: just plug the replica and keep working while waiting for some replacement hardware.

    Plus one "on-demand" (usually weekly) backup to a distant storage server (far away + encrypted + versioned thanks to nilfs2 goodness, da time machine in da cloud). You can bomb my home and my whole country, photos of my fluffy kitty will survive me HAHAHA.

    Thanked by 1: TODO