Comments
I hope you can make it work to back up Kloxo too.
Brilliant! I look forward to it; it means I can hopefully get rid of Bacula.
What Bacula features did you like?
Did you make it work?
Please see: https://github.com/PetaByet/cdp/issues/7
Pardon if this is a dumb question but does this need root access to the target server or what exactly?
It does not. It only needs a user that has sufficient privileges to read the files / create the tar archive.
Unless you're backing up your grandma's photos from /home/grandma, yes, your backup server must have root access to every node.
So, if the backup server gets compromised, good luck. They will also have root access to every one of your nodes.
This is the reason I never use pulling backups.
Still better than nodes pushing their backups.
Since CDP is meant to be installed on a fresh system, it should also harden it.
Also, it would be nice if the backups could not be decrypted on the server. Encrypt them with a public key of mine, and I'll download them and decrypt locally.
But I like it so far. Thanks.
Why? From a security perspective pushing full encrypted backups* is the best option.
If you're doing diff backups, it's another topic. Diff backups and pushing don't go together. But I've yet to think of a good way to do encrypted diff backups (without the master having root access to every node) yet. If anyone has a good idea, let me know.
* Full as in: the node tars the relevant directories, encrypts the archive, and pushes the backup to the master backup server.
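To make that footnote concrete, here is a minimal sketch of the push flow, using a throwaway directory and symmetric openssl encryption purely for the demo. A real setup would encrypt to the admin's public key (e.g. gpg --encrypt --recipient) and push with scp; all paths and the passphrase here are placeholders, not anything CDP actually uses:

```shell
set -e
# Demo data standing in for the directories the node would archive
mkdir -p /tmp/push-demo/etc && echo "config" > /tmp/push-demo/etc/app.conf

# 1. Node tars the relevant directories
tar czf /tmp/backup.tar.gz -C /tmp/push-demo .

# 2. Node encrypts the archive before it leaves the box
#    (symmetric for the demo; use public-key encryption in practice)
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in /tmp/backup.tar.gz -out /tmp/backup.tar.gz.enc -pass pass:demo-secret

# 3. Node pushes the encrypted blob to the master backup server
#    (placeholder; in practice something like:
#     scp /tmp/backup.tar.gz.enc backup@master:/backups/)
echo "would push /tmp/backup.tar.gz.enc to the backup master"

# Verify the archive round-trips
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in /tmp/backup.tar.gz.enc -pass pass:demo-secret \
  | tar tz | grep -q etc/app.conf && echo "round-trip ok"
```

The key point is ordering: the archive is encrypted before it leaves the node, so the backup master only ever stores ciphertext.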
What about GnuPG or RSA?
@PetaByet
I found a glitch (not big enough to classify as a bug).
Check the marked points in the attached screenshot: you still have to enter a password after choosing no encryption.
Sorry for behaving like a kid, but it's throwing some critical errors:
Check this one:
I faced this when I tried to back up one of my servers. I also tried chown apache:apache on the directory and files where it's giving the error, but it's still the same. Any workaround?
You did great work, but it needs some time to get stable. Best of luck with it.
I got the same yesterday when I tried CDP out. Would like to know how to fix this or why it's happening.
It's not a bug, you haven't configured the permissions on your server correctly.
See page 2 in this thread:
http://www.lowendtalk.com/discussion/comment/785943/#Comment_785943
Fixed.
It's easier if you just run it via the command line or a cron job, as the system is designed for that.
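For instance, a nightly crontab entry could look like this (the script path and log file are placeholders, not CDP's actual install paths):

```shell
# Run the backup non-interactively every night at 03:00;
# append stdout and stderr to a log so failures are visible
0 3 * * * /usr/local/cdp/backup.sh >> /var/log/cdp-backup.log 2>&1
```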
I was thinking about GnuPG, yes.
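What gpg does under the hood for public-key encryption is hybrid encryption, which can be sketched with plain openssl: only the public key ever lives on the node, so neither a compromised node nor the backup server can read old backups. All filenames here are demo placeholders:

```shell
set -e
cd /tmp

# Admin generates a keypair locally; only the PUBLIC key goes to the nodes
openssl genrsa -out admin.key 2048 2>/dev/null
openssl rsa -in admin.key -pubout -out admin.pub 2>/dev/null

# On the node: wrap a fresh random session key with the public key,
# then encrypt the archive with that session key (this is roughly what
# gpg --encrypt --recipient does for you internally)
echo "backup payload" > archive.tar
openssl rand -hex 32 > session.key
openssl pkeyutl -encrypt -pubin -inkey admin.pub -in session.key -out session.key.enc
openssl enc -aes-256-cbc -pbkdf2 -in archive.tar -out archive.tar.enc -pass file:session.key
rm session.key   # the node keeps nothing that can decrypt the backup

# Back on the admin's machine: recover the session key, then the archive
openssl pkeyutl -decrypt -inkey admin.key -in session.key.enc -out session.key
openssl enc -d -aes-256-cbc -pbkdf2 -in archive.tar.enc -pass file:session.key
```

In practice you would just use gpg for this; the openssl version only shows why the scheme stays diff-unfriendly: every archive is opaque ciphertext to the server.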
How do I back up a whole dedicated server? I find it very annoying.
Have any of you tested whether the load rises too high? I'll use this script on a cPanel server if it doesn't.
btw, nice script!
Give it / as the directory.
It just tars the directory(s) and transfers the compressed archive to the backup server, so if your server has a good CPU and internet connection, this script shouldn't cause a lot of (if any) disruption.
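If load on a busy box is the worry, the tar step can be wrapped in nice/ionice so it only gets leftover CPU and I/O priority. A small sketch with a throwaway directory (this is a general trick, not a CDP feature):

```shell
set -e
mkdir -p /tmp/load-demo && echo "hello" > /tmp/load-demo/file.txt

# Same tar-and-compress idea, but run at the lowest CPU priority
# (nice 19) and in the idle I/O class (ionice -c 3) so interactive
# workloads on a busy cPanel box stay responsive
nice -n 19 ionice -c 3 \
  tar czf /tmp/load-demo.tar.gz -C /tmp load-demo

# Confirm the archive was written
tar tzf /tmp/load-demo.tar.gz | grep -q load-demo/file.txt && echo "archive ok"
```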
Hey @Petabyet doing some great work on this, any chance for some sort of partition backup or Block level? so i can take the whole /home or whole /vz etc
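For reference, block-level backup of a partition is usually just dd piped through a compressor. A sketch against a throwaway image file so nothing real is touched (in practice if= would point at the actual block device, e.g. the one backing /vz, ideally unmounted or snapshotted first so the copy is consistent):

```shell
set -e
# Create a throwaway "device" image for demonstration (placeholder path)
dd if=/dev/zero of=/tmp/demo.img bs=1M count=4 status=none

# Block-level copy, compressed on the fly; in real use if= would be
# a block device such as /dev/vg0/home rather than a file
dd if=/tmp/demo.img bs=4M status=none | gzip > /tmp/demo.img.gz

# Restore: decompress back onto the target device (here, another file)
gunzip -c /tmp/demo.img.gz | dd of=/tmp/demo-restored.img bs=4M status=none

cmp /tmp/demo.img /tmp/demo-restored.img && echo "images match"
```

Note that dd of a mounted, changing filesystem can produce a corrupt image, which is why snapshotting (LVM, etc.) first matters for /home or /vz.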
Using dd, I'm assuming?

Hello,
This is a very good script.
I would like the same script, but with multiple users.
Did you plan this option?
This script already supports multiple user accounts as well as ACLs.
Thank you for the reply,
But I would like to manage quotas.
For example, give one user 10 GB of space and another user 20 GB.
How can I do that?
Thank you
Why would you want that?
CDP as a Service
@PetaByet Does CDP provide a way to see the files inside backups? Can I restore backups on a per-file/dir basis? E.g. I want to check whether /etc/any.conf exists and restore it to the original node or to another. Is this possible?
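Since earlier in the thread it's said the backups are plain tar archives, listing contents and restoring a single file can be done with tar itself; /etc/any.conf below mirrors the question, and all paths are demo placeholders:

```shell
set -e
# Build a stand-in backup archive the way the thread describes (tar of a dir)
mkdir -p /tmp/restore-demo/etc && echo "setting=1" > /tmp/restore-demo/etc/any.conf
tar czf /tmp/restore-demo.tar.gz -C /tmp/restore-demo .

# See the files inside the backup without extracting anything
tar tzf /tmp/restore-demo.tar.gz

# Restore just one file from the archive into a scratch directory
mkdir -p /tmp/restore-out
tar xzf /tmp/restore-demo.tar.gz -C /tmp/restore-out ./etc/any.conf
cat /tmp/restore-out/etc/any.conf
```

Extracting into a scratch directory first, then copying the file into place, avoids clobbering a live config by accident.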
Oh, nice script. On a sidenote, it should mention somewhere that the entire /var/www folder gets deleted. I wasn't aware of that, lol.