New on LowEndTalk? Please Register and read our Community Rules.
All new Registrations are manually reviewed and approved, so a short delay after registration may occur before your account becomes active.
Backup questions
If you use rsync for backup, and you get hacked or a virus deletes/corrupts all of your files, won't the same thing happen on the backup box because it syncs everything?
Is it better to set up cron jobs which tar the /var/www/ directory and a MySQL dump, then copy them over to a backup server? And if so, couldn't a hacker who compromises the main server also get the backup server's IP and the SCP password/auth key? If that's the case, is it better for the backup server to download from the main server instead of the other way around?
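A minimal sketch of that tar + mysqldump approach, meant to be called from cron. All paths, the function name, and the dump options here are illustrative, not from the thread; the mysqldump step is guarded so the script still produces the web-root archive if MySQL isn't installed.

```shell
#!/bin/sh
# Hypothetical nightly backup: archive a web root plus a MySQL dump into
# an outgoing directory that the backup server later pulls from.
backup_site() {
    src=$1    # directory to archive, e.g. /var/www
    out=$2    # staging directory, e.g. /home/backup/outgoing
    stamp=$(date +%F)
    mkdir -p "$out"
    # Dump all databases first, if mysqldump is present (skipped otherwise;
    # credentials would normally come from ~/.my.cnf, not the command line).
    if command -v mysqldump >/dev/null 2>&1; then
        mysqldump --all-databases > "$out/db-$stamp.sql" 2>/dev/null || true
    fi
    # Archive the web root; -C keeps the paths inside the archive relative.
    tar -czf "$out/www-$stamp.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
}
```

A cron line such as `30 3 * * * /usr/local/bin/backup_site.sh` would then run it nightly, and the backup server pulls from the staging directory on its own schedule.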
Lastly, what apps are out there to set this up? I don't know if I trust my own scripts/cron jobs.
Thanks
Comments
I keep live backups (i.e. backups copied to another server on the internet on a frequent basis) and also download a copy to store on an external hard drive at my home which remains disconnected from the PC unless in the process of downloading a backup.
You don't push backups, you pull backups. Pushing is vulnerable.
Use something like rsnapshot.
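rsnapshot is built around exactly that pull model: the backup box holds the SSH key and fetches rotating hard-linked snapshots, so the production server never touches the backups. A minimal rsnapshot.conf sketch (hostnames and paths are made up; note that rsnapshot requires TAB-separated fields, not spaces):

```
# /etc/rsnapshot.conf (fragment) -- fields below are TAB-separated
snapshot_root	/backups/snapshots/
retain	daily	7
retain	weekly	4
# Pull the web root from the production box over SSH:
backup	backupuser@web1.example.com:/var/www/	web1/
```

Cron on the backup server then runs `rsnapshot daily` and `rsnapshot weekly`; old snapshots share unchanged files via hard links, so seven dailies cost little more disk than one full copy.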
If you rsync, you only keep the last backup. A far better method is to use something like Duplicity to create incremental (diff) snapshots.
Check the other side of your logic: if a hacker gets access to your backup server, don't they have access to the keys for every server? Having the backup server perform the backups reduces the attack surface, but increases the potential loss. A better solution is to create dedicated credentials for each backup client (e.g. each server gets different S3 API keys, etc.).
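One way to get those dedicated, limited credentials with plain SSH (rather than S3) is a forced-command key: on each production server, the backup box's key is restricted so it can only run a read-only rsync against one directory. A sketch of the `authorized_keys` entry (the `rrsync` helper script ships with rsync, though its install path varies by distro; key and paths are illustrative):

```
# ~/.ssh/authorized_keys on the production server (one line; key truncated)
command="/usr/share/rsync/scripts/rrsync -ro /var/www",restrict ssh-ed25519 AAAA... backup@backupbox
```

With this, stealing the backup box's key only yields read access to /var/www on that one host, not a shell anywhere.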
Are they still in development? The last official Ubuntu release posted on their site was for Jaunty 9.04, and the last git commits were from 2013.
There is nothing to develop really. It's pretty stable.
Thanks guys, I think rsnapshot is what I'm looking for
+1 for rsnapshot -- I've begun using the tool recently and it works great.
duply?
I use a custom script which makes a tar.gz file of selected folders, then copies the files to my backup server. Furthermore, I use chrooted environments, with a separate user for every server that gets backed up. Even if everything goes wrong, a hacker could only delete my backups, and I sync them to another backup server, so I always have three copies of my files (live server, backup1, and backup2).
hey, I need a bit of help with this. I just created my own shell script which works (it copies /var/www/ and an SQL dump into a 7zip file), but I want to make it so the user "x" (example) can only view files in
/home/x/backups
and do nothing else. Then I want to use scp from another server to copy the backup files over to the backup server. How can I lock down the user "x" so he can do nothing except read that backup folder?
thanks
edit: I figured out a way, but it only works with SFTP and not SCP. know any good SFTP servers which require very little knowledge to set up?
@hostnoob I use OpenSSH for this; the feature is called a chrooted environment.
That's what I'm using, but it says it only works with an SFTP server.
SFTP is an extension of SSH. If you're using SSH version 2, you should already have SFTP capabilities.
Edit: upon connection, SFTP asks for a valid user and password (/etc/passwd) and starts in the user's home directory. Uploaded files are created with that user's ownership and access level.
Thanks, I've got it working now. I had to set the owner of /home/x/ to root:root
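The chroot setup described in this thread can be sketched as an sshd_config fragment. "x" is the example username from above; OpenSSH requires the chroot directory and every parent of it to be owned root:root and not group/world-writable, which is why the /home/x ownership change was needed:

```
# /etc/ssh/sshd_config (fragment) -- restrict user "x" to SFTP in a chroot
Match User x
    ChrootDirectory /home/x
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```

Since /home/x itself is root-owned, uploads go into a writable subdirectory such as /home/x/backups owned by x; the backup server then fetches files with `sftp` rather than `scp`, because `ForceCommand internal-sftp` blocks the remote command scp needs.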