New on LowEndTalk? Please Register and read our Community Rules.
All new Registrations are manually reviewed and approved, so a short delay after registration may occur before your account becomes active.
Comments
Isn't that good enough... or are you looking for some UI?
Ahsay if the data changes; otherwise a copy of the config files.
Haha no, just wondering what other people are using here.
Compressed .tar.gz, one copy on the server and another elsewhere.
Speedy restores, especially for cPanel account migrations.
Is this good enough? Open to suggestions...
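A minimal sketch of that tar.gz-and-keep-two-copies approach. All paths and the remote host are made up, and the script creates throwaway demo data under /tmp so it runs end-to-end; swap in your real site directory:

```shell
#!/bin/sh
# Sketch: compressed .tar.gz, one copy local, one shipped elsewhere.
# SITE_DIR / BACKUP_DIR are hypothetical; adjust for your server.
SITE_DIR="${SITE_DIR:-/tmp/demo_site}"
BACKUP_DIR="${BACKUP_DIR:-/tmp/backups}"

# Demo fixture so this runs as-is; remove for real use.
mkdir -p "$SITE_DIR" "$BACKUP_DIR"
echo "index" > "$SITE_DIR/index.html"

STAMP=$(date +%Y%m%d)
ARCHIVE="$BACKUP_DIR/site-$STAMP.tar.gz"
tar -czf "$ARCHIVE" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"

# Second copy elsewhere, e.g. over SSH (host is hypothetical):
# scp "$ARCHIVE" backup@backup.example.com:/srv/backups/
echo "wrote $ARCHIVE"
```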
For our cPanel-based sites, we're using S3 backups.
Well, for a DirectAdmin server it's an easy two clicks to back up all users and restore. For InterWorx it's a 24-hour nightmare (example: exporting/importing 300 users).
So only R1Soft helps... which is fast. Please kill me before I ever want to do an export/import on the InterWorx side again. Never... never...
For now we use OnApp backups, and we back up the main database for our main website to the Strongspace SFTP service.
That's all for now.
I use git for my webserver. What I like about it is that, for example, when I got infected, I could go back to different dates and check what changed.
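A sketch of that workflow: a throwaway repo under /tmp stands in for the webroot (the identity and file names are made up), a simulated injection is spotted with `git diff`, then rolled back to the last known-good commit:

```shell
# Sketch: use git history to find and revert changes after a compromise.
# /tmp/demo_webroot stands in for a real webroot so this runs as-is.
REPO=/tmp/demo_webroot
rm -rf "$REPO"; mkdir -p "$REPO"; cd "$REPO"
git init -q
git config user.email you@example.com   # hypothetical identity
git config user.name you
echo "clean" > index.php
git add -A && git commit -q -m "known-good state"

echo "eval(evil)" >> index.php          # simulate an injected change
git status --short                       # shows the modified file
git diff                                 # shows exactly what changed
git checkout -- index.php                # roll back to the clean commit
```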
I am using R1Soft currently, but it's getting costly due to the number of servers. Currently looking for an alternative.
Let me know if you find any good options (ones that have a nice GUI as well).
Acronis
I back up my servers to my Kidechire servers and their FTPs... I use Virtualmin's built-in backup function.
For servers that don't have Virtualmin or any panel installed, I use a bash script I wrote that automatically backs up the files and DBs.
All my backups are offline backups; I don't have anything critical enough for real-time backup.
I also use S3 storage for clients' websites.
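A stripped-down sketch of that kind of files+databases script. Paths are hypothetical and it creates demo data under /tmp so it runs as-is; the mysqldump step is gated on the tool being installed, and the credentials setup is assumed to come from ~/.my.cnf:

```shell
#!/bin/bash
# Minimal sketch of a files + databases backup script.
SRC="${SRC:-/tmp/demo_www}"
DEST="${DEST:-/tmp/demo_backup}"
mkdir -p "$SRC" "$DEST"
echo "site" > "$SRC/page.html"          # demo fixture; remove for real use

STAMP=$(date +%F)
tar -czf "$DEST/files-$STAMP.tar.gz" -C "$SRC" .

# Dump all databases if MySQL is present (credentials assumed in ~/.my.cnf).
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --all-databases | gzip > "$DEST/db-$STAMP.sql.gz"
fi
ls -l "$DEST"
```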
Simple: tar the entire box and gpg it:
https://paster.li/?489e633b7ff3654f#BfXUgcyogUt966UQYSe0tvphOh4GLHNApyC4RKggfMU=
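In case the paste goes dead, the idea looks roughly like this. The passphrase and exclude list are assumptions, and a tiny demo directory stands in for the whole box so the commands run as-is; the gpg step is gated on gpg being installed:

```shell
# Sketch: tar the box, then encrypt the archive with gpg (symmetric).
SRC=/tmp/demo_box; OUT=/tmp/demo_box.tar.gz
mkdir -p "$SRC"; echo "etc" > "$SRC/config"   # stand-in for the real box

tar -czf "$OUT" -C /tmp demo_box
# For a real box you would tar / with excludes, e.g.:
#   tar -czf /root/box.tar.gz --exclude=/proc --exclude=/sys --exclude=/dev /

if command -v gpg >/dev/null 2>&1; then
    # "changeme" is a placeholder passphrase.
    gpg --batch --yes --symmetric --passphrase "changeme" -o "$OUT.gpg" "$OUT"
fi
```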
Manual: SFTP with FileZilla.
Nothing
rdiff-backup for the OS except MySQL, and the usual mysqldump for MySQL.
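The split described there might look like this: rdiff-backup keeps reverse increments of the filesystem, so the live MySQL data directory is excluded and dumped separately for consistency. Paths are hypothetical, both tools are gated on being installed, and demo data is created so it runs:

```shell
# Sketch: rdiff-backup for the OS, mysqldump for the databases.
SRC=/tmp/demo_os; DEST=/tmp/demo_rdiff
mkdir -p "$SRC" "$DEST"
echo "conf" > "$SRC/app.conf"            # demo fixture

if command -v rdiff-backup >/dev/null 2>&1; then
    # Exclude live MySQL data files; they get a consistent dump instead.
    rdiff-backup --exclude '**/mysql' "$SRC" "$DEST"
fi

if command -v mysqldump >/dev/null 2>&1; then
    # --single-transaction gives a consistent dump for InnoDB tables.
    mysqldump --all-databases --single-transaction > /tmp/demo_all.sql
fi
```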
I'm a big fan of dumping databases, tarring things up, and rsyncing them to the backup server. Though with the latest MXroute server I'm using cPanel's backup system to send to VPSDime and S3, but that's just because it's there and easy.
Uploading everything to S3 and replicating it around the world to the 5 other backup servers we have.
+1 for adding the
--numeric-owner
switch to the "tar" command. I've found that if I don't use this switch when creating the archive, then later, when restoring it on a different system (e.g. when my provider restores my OpenVZ archive from his host node), the file/directory owners of the extracted files are messed up.
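For reference, the flag in context. With --numeric-owner, tar records UIDs/GIDs as raw numbers instead of names, so a restore on a host whose user names map to different IDs keeps ownership intact. Throwaway data under /tmp so it runs as-is:

```shell
# Demonstrate --numeric-owner on create; pass it again on extract.
mkdir -p /tmp/demo_own
echo "data" > /tmp/demo_own/file

tar --numeric-owner -czf /tmp/demo_own.tar.gz -C /tmp demo_own
# On the destination system, restore with the same flag:
#   tar --numeric-owner -xzf demo_own.tar.gz
tar --numeric-owner -tvzf /tmp/demo_own.tar.gz   # listing shows numeric ids
```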
Most things that I need are easy-to-install pieces of software; I often create a bash script to auto-install the things I want (plus dependencies and configuration if needed). For the more normal things I use Dropbox and hubiC (I might use GDrive if I really need it) for the time being. I just save some SQLite DBs there (in compressed format) and that's pretty much it. For the more critical things I use 2 LEBs (2 different providers, parent companies, and DCs) plus the previous method.
I use Attic. I like the deduplication feature.
rsnapshot?
what?
rsync works for me.
He's using a piece of software named "Nothing".
I highly advise @cassa to use a backup method if the server is an active production one. Always be ready for the worst-case scenario.
rsync and encrypted to death without insecure ciphers and with EncFS
Wonderful for filesystem backups, it's a great and reliable tool.
Essentially tar with the increment option, rsync/lftp, and mysqldump, along with mutt-based notifications and cron. It seemed simple, but the bash script now runs to 1,500 lines, with two conf files.
I didn't want to adapt to another solution.
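The core of that tar-with-increments idea fits in a few lines using GNU tar's snapshot file: the first run with a fresh snapshot is a full backup, later runs against the same snapshot capture only what changed. Paths are made up and demo data is created so it runs as-is:

```shell
# Sketch: GNU tar incremental backups via --listed-incremental.
SRC=/tmp/demo_inc_src; DEST=/tmp/demo_inc
mkdir -p "$SRC" "$DEST"
echo "v1" > "$SRC/a.txt"

SNAP="$DEST/state.snar"
rm -f "$SNAP"                            # fresh snapshot => level-0 backup
tar --listed-incremental="$SNAP" -czf "$DEST/full.tar.gz" -C "$SRC" .

echo "v2" > "$SRC/b.txt"                 # change something
tar --listed-incremental="$SNAP" -czf "$DEST/incr1.tar.gz" -C "$SRC" .

# Ship off-box with rsync (host is hypothetical):
# rsync -az "$DEST"/ backup@backup.example.com:/srv/tar-backups/
```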
I'm still using Duplicity with a pre-backup MySQL/PostgreSQL dump script to another storage VPS via SSH. (https://raymii.org/s/tutorials/Website-and-database-backup-with-Duplicity.html). The storage VPS itself gets backed up to an Openstack Cloud Object Store (https://raymii.org/s/tutorials/Encrypted_Duplicity_Backups_to_Openstack_Swift_Objectstore.html) - fully encrypted.
Recently I've written a Duplicity wrapper for one of my clients: http://cdn.duplicity.so/README.html
Daily cron job to 7zip the dirs I want to back up (including a fresh SQL dump),
then I write the file to a folder which is only accessible by a certain account, and pull the files (using that account with SSH keys) from a backup server.
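A sketch of what that daily job might look like. The directory names are assumptions, the mysqldump step is gated on the tool existing (credentials assumed in ~/.my.cnf), and there's a tar.gz fallback for when 7z isn't installed; demo fixtures are created so it runs end-to-end:

```shell
# Sketch: daily archive of target dirs plus a fresh SQL dump.
DIRS=/tmp/demo_7z_src; OUT=/tmp/demo_7z_out
mkdir -p "$DIRS" "$OUT"
echo "site" > "$DIRS/index.html"        # demo fixture

# Fresh dump first, so it lands inside the archive.
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --all-databases > "$DIRS/dump.sql"
fi

STAMP=$(date +%F)
if command -v 7z >/dev/null 2>&1; then
    7z a "$OUT/backup-$STAMP.7z" "$DIRS" >/dev/null
else
    tar -czf "$OUT/backup-$STAMP.tar.gz" -C /tmp demo_7z_src
fi
ls "$OUT"
```

The resulting file would then be dropped in a directory readable only by the dedicated backup account, and the backup server pulls it over SSH with that account's key.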
That's why I'm following this topic ;-)
Well, I just started with my backup setup.
Just set up BackupPC, a pretty easy tool. Probably going to mirror the backup server at some point.